Compare commits

main..refactor

No commits in common. "main" and "refactor" have entirely different histories.

42 changed files with 1446 additions and 3408 deletions

.gitignore vendored (2 changes)

@@ -3,5 +3,5 @@
/media/*
/posts/*
!/posts/README.md
/cache
/.slbg-cache
/config.toml

BUILDING.md

@@ -1,51 +0,0 @@
# Building bingus-blog
this guide assumes you have git and are on linux.
at the moment, compiling on windows is supported, but not _for windows_.
1. first, acquire _rust nightly_.
the recommended method is to install [rustup](https://rustup.rs/),
and use that to get _rust nightly_. choose "customize installation",
and set "default toolchain" to nightly to save time later, provided
you do not need _rust stable_ for something else
2. start your favorite terminal
3. then, download the repository: `git clone https://git.slonk.ing/slonk/bingus-blog && cd bingus-blog`
4. finally, build the application: `cargo +nightly build --release`
5. your executable is `target/release/bingus-blog`, copy it to your server and
you're done!
## Building for another architecture
you can use the `--target` flag in `cargo build` for this purpose.
examples are for Arch Linux x86_64.
here's how to compile for `aarch64-unknown-linux-gnu`
(e.g. Oracle CI Free Tier ARM VPS):
```sh
# install the required packages to compile and link aarch64 binaries
sudo pacman -S aarch64-linux-gnu-gcc
cargo +nightly build --release --target=aarch64-unknown-linux-gnu
```
your executable will be `target/aarch64-unknown-linux-gnu/release/bingus-blog`.
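if rustup complains that the target's standard library isn't installed, add it
first (this step isn't spelled out above, but is usually needed):
```sh
# fetch the precompiled aarch64 standard library for nightly
rustup target add aarch64-unknown-linux-gnu --toolchain nightly
```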
---
a more tricky example is building for `aarch64-unknown-linux-musl`
(e.g. a Redmi 5 Plus running postmarketOS):
```sh
# there is no toolchain for aarch64-unknown-linux-musl,
# so we have to repurpose the GNU toolchain. this doesn't
# work out of the box so we have to set some environment variables
sudo pacman -S aarch64-linux-gnu-gcc
export CC=aarch64-linux-gnu-gcc
export CARGO_TARGET_AARCH64_UNKNOWN_LINUX_MUSL_LINKER=$CC
cargo +nightly build --release --target=aarch64-unknown-linux-musl
# we had to do this because cargo looks for a toolchain matching
# the target's name by default, and none exists for musl. the
# variables above tell it to use the GNU one instead.
```
your executable will be `target/aarch64-unknown-linux-musl/release/bingus-blog`.
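to sanity-check a cross-compiled binary, `file` works well (the exact output
will vary by system):
```sh
file target/aarch64-unknown-linux-musl/release/bingus-blog
# expect something like: ELF 64-bit LSB executable, ARM aarch64, statically linked
```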

CONFIG.md

@@ -1,82 +0,0 @@
# Configuration
the configuration format, with defaults, is documented below:
```toml
title = "bingus-blog" # title of the blog
# description of the blog
description = "blazingly fast blog software written in rust memory safe"
raw_access = true # allow users to see the raw source of a post
js_enable = true # enable javascript (required for sorting and dates)
engine = "markdown" # choose which post engine to use
# options: "markdown", "blag"
# absolutely do not use "blag" unless you know exactly
# what you are getting yourself into.
[style]
date_format = "RFC3339" # format string used to format dates in the backend
# it's highly recommended to leave this as default,
# so the date can be formatted by the browser.
# format: https://docs.rs/chrono/latest/chrono/format/strftime/index.html#specifiers
default_sort = "date" # default sorting method ("date" or "name")
#default_color = "#f5c2e7" # default embed color, optional
[style.display_dates]
creation = true # display creation ("written") dates
modification = true # display modified ("last modified") dates
[rss]
enable = false # serve an rss feed under /feed.xml
# this may be a bit resource intensive
link = "https://..." # public url of the blog, required if rss is enabled
[dirs]
posts = "posts" # where posts are stored
media = "media" # directory served under /media/
custom_templates = "templates" # custom templates dir
custom_static = "static" # custom static dir
# see CUSTOM.md for documentation
[http]
host = "::" # ip to listen on
port = 3000 # port to listen on
[cache]
enable = true # save metadata and rendered posts into RAM
# highly recommended, only turn off if absolutely necessary
#ttl = 5 # how long an item should persist in cache,
# in milliseconds
# uncomment to enable
cleanup = true # clean cache, highly recommended
#cleanup_interval = 86400000 # clean the cache regularly instead of
# just at startup, value in milliseconds
# uncomment to enable
persistence = true # save the cache to disk on shutdown and load on startup
file = "cache" # file to save the cache to
compress = true # compress the cache file
compression_level = 3 # zstd compression level, 3 is recommended
[render]
escape = false # escape HTML in the markdown source instead of
# clobbering it (https://docs.rs/comrak/latest/comrak/struct.RenderOptions.html#structfield.escape)
unsafe = false # allow HTML and dangerous links (https://docs.rs/comrak/latest/comrak/struct.RenderOptions.html#structfield.unsafe_)
[render.syntect]
load_defaults = false # include default syntect themes
themes_dir = "themes" # directory to include themes from
theme = "Catppuccin Mocha" # theme file name (without `.tmTheme`)
[blag]
bin = "blag" # path to blag binary
```
configuration is done in [TOML](https://toml.io/).
if an option marked "optional" is not set, it will not be initialized with
a default value.
you don't have to copy the whole thing from here;
it's generated by the program if it doesn't exist.
## Specifying the configuration file
the configuration file is loaded from `config.toml` by default, but the path
can be overridden by setting the environment variable `BINGUS_BLOG_CONFIG`,
which will make bingus-blog try to read that file, failing and exiting if it can't.
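for example, to run with a config kept outside the working directory (the path
here is just for illustration):
```sh
BINGUS_BLOG_CONFIG=/etc/bingus-blog/config.toml ./bingus-blog
```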

CUSTOM.md

@@ -1,49 +0,0 @@
# Custom Content
bingus-blog supports loading custom content such as templates and static files
at runtime from custom locations.
the configuration options `dirs.custom_templates` and `dirs.custom_static`
allow you to set where these files are loaded from.
customizing the error page, other than CSS, is not supported at this time.
## Custom Templates
custom templates are written in
[Handlebars (the rust variant)](https://crates.io/crates/handlebars).
the *custom templates directory* has a non-recursive structure:
```md
./
- index.html # ignored
- index.hbs # loaded as `index`
- post.hbs # loaded as `post`
- [NAME].hbs # loaded as `[NAME]`
- ...
```
templates are loaded first from the executable, then from the *custom
templates directory*, overriding the defaults.
template changes are also picked up after startup: any changed template will be
recompiled and will replace the existing template in the registry, or add a
new one (though that does nothing).
if a template is deleted, the default template will be recompiled in
its place.
note that the watcher only works if the *custom templates directory* existed
at startup. if you delete/create the directory, you must restart the program.
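for example, to override the default `index` template (the destination file
name is what matters; the source file here is hypothetical):
```sh
mkdir -p templates
# a file named index.hbs in the custom templates directory replaces the
# built-in `index` template, even while the server is running
cp my-index.hbs templates/index.hbs
```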
## Custom Static Files
GET requests to `/static` will first be checked against `dirs.custom_static`.
if the file is not found in the *custom static directory*, bingus-blog will try
to serve it from the directory embedded in the executable. this means you can
add whatever you want in the *custom static directory* and it will be served
under `/static`.
## Custom Media
the endpoint `/media` is served from `dirs.media`. no other logic or mechanism
is present.

Cargo.lock generated (1780 changes)

File diff suppressed because it is too large

Cargo.toml

@@ -3,7 +3,6 @@ name = "bingus-blog"
version = "0.1.0"
edition = "2021"
default-run = "bingus-blog"
repository = "https://git.slonk.ing/slonk/bingus-blog"
[features]
default = []
@@ -33,35 +32,26 @@ chrono = { version = "0.4.37", features = [
"serde",
], default-features = false }
color-eyre = "0.6.3"
comrak = { version = "0.32.0", features = [
comrak = { version = "0.22.0", features = [
"syntect",
], default-features = false }
console-subscriber = { version = "0.4.1", optional = true }
console-subscriber = { version = "0.2.0", optional = true }
derive_more = "0.99.17"
fronma = "0.2.0"
futures = "0.3.31"
handlebars = "6.0.0"
include_dir = "0.7.4"
indexmap = { version = "2.7.0", features = ["serde"] }
mime_guess = "2.0.5"
notify-debouncer-full = { version = "0.4.0", default-features = false }
rss = "2.0.7"
scc = { version = "2.1.0", features = ["serde"] }
serde = { version = "1.0.197", features = ["derive", "rc"] }
serde-value = "0.7.0"
serde_json = { version = "1.0.124", features = ["preserve_order"] }
serde = { version = "1.0.197", features = ["derive"] }
syntect = "5.2.0"
thiserror = "2.0.9"
thiserror = "1.0.58"
tokio = { version = "1.37.0", features = [
"fs",
"macros",
"rt-multi-thread",
"signal",
"process",
] }
tokio-util = { version = "0.7.10", default-features = false }
toml = "0.8.12"
tower = { version = "0.5.2", features = ["util"] }
tower-http = { version = "0.6.2", features = [
tower-http = { version = "0.5.2", features = [
"compression-gzip",
"fs",
"trace",

README.md (161 changes)

@@ -1,47 +1,85 @@
---
title: README
description: the README.md file of this project
author: slonkazoid
written_at: 2024-04-18T04:15:26+03:00
title: "README"
description: "the README.md file of this project"
author: "slonkazoid"
created_at: 2024-04-18T04:15:26+03:00
---
# bingus-blog
blazingly fast markdown blog software written in rust memory safe
for bingus-blog viewers: [see original document](https://git.slonk.ing/slonk/bingus-blog)
## Features
- posts are written in markdown and loaded at runtime, meaning you
can write posts from anywhere and sync them with the server without headache
- RSS is supported
- the look of the blog is extremely customizable, with support for
[custom drop-ins](CUSTOM.md) for both templates and static content
- really easy to deploy (the server is one executable file)
- blazingly fast
## TODO
- [ ] blog thumbnail and favicon
- [ ] sort asc/desc
- [x] RSS
- [x] finish writing this document
- [x] document config
- [ ] extend syntect options
- [x] ^ fix syntect mutex poisoning
- [ ] general cleanup of code
- [ ] better error reporting and error pages
- [ ] better tracing
- [ ] replace HashMap with HashCache once i implement [this](https://github.com/wvwwvwwv/scalable-concurrent-containers/issues/139)
- [x] cache cleanup task
- [ ] ^ replace HashMap with HashCache once i implement [this](https://github.com/wvwwvwwv/scalable-concurrent-containers/issues/139)
- [x] (de)compress cache with zstd on startup/shutdown
- [ ] make date parsing less strict
- [x] improve home page
- [x] make date formatting better
- [ ] date formatting respects user timezone
- [x] clean up imports and require less features
- [ ] improve home page
- [x] tags
- [ ] multi-language support
- [x] add credits
- [x] be blazingly fast
- [x] 100+ MiB binary size
## Configuration
see [CONFIG.md](CONFIG.md)
the default configuration with comments looks like this
## Building
```toml
title = "bingus-blog" # title of the blog
# description of the blog
description = "blazingly fast markdown blog software written in rust memory safe"
markdown_access = true # allow users to see the raw markdown of a post
# endpoint: /posts/<name>.md
date_format = "RFC3339" # format string used to format dates in the backend
# it's highly recommended to leave this as default,
# so the date can be formatted by the browser.
# format: https://docs.rs/chrono/latest/chrono/format/strftime/index.html#specifiers
js_enable = true # enable javascript (required for above)
[rss]
enable = false # serve an rss feed under /feed.xml
# this may be a bit resource intensive
link = "https://..." # public url of the blog, required if rss is enabled
[dirs]
posts = "posts" # where posts are stored
media = "media" # directory served under /media/
[http]
host = "0.0.0.0" # ip to listen on
port = 3000 # port to listen on
[cache]
enable = true # save metadata and rendered posts into RAM
# highly recommended, only turn off if absolutely necessary
cleanup = true # clean cache, highly recommended
#cleanup_interval = 86400000 # clean the cache regularly instead of just at startup
# uncomment to enable
persistence = true # save the cache to disk on shutdown and load on startup
file = "cache" # file to save the cache to
compress = true # compress the cache file
compression_level = 3 # zstd compression level, 3 is recommended
[render]
syntect.load_defaults = false # include default syntect themes
syntect.themes_dir = "themes" # directory to include themes from
syntect.theme = "Catppuccin Mocha" # theme file name (without `.tmTheme`)
```
you don't have to copy it from here, it's generated if it doesn't exist
## Usage
this project uses nightly-only features.
make sure you have the nightly toolchain installed.
@@ -54,7 +92,21 @@ cargo +nightly build --release
the executable will be located at `target/release/bingus-blog`.
see [BUILDING.md](BUILDING.md) for more information and detailed instructions.
### Building for another architecture
you can use the `--target` flag in `cargo build` for this purpose
building for `aarch64-unknown-linux-musl` (for example, a Redmi 5 Plus running postmarketOS):
```sh
# install the required packages to compile and link aarch64 binaries
sudo pacman -S aarch64-linux-gnu-gcc
export CC=aarch64-linux-gnu-gcc
export CARGO_TARGET_AARCH64_UNKNOWN_LINUX_MUSL_LINKER=$CC
cargo +nightly build --release --target=aarch64-unknown-linux-musl
```
your executable will be located at `target/<target>/release/bingus-blog` this time.
## Writing Posts
@@ -73,28 +125,21 @@ every post **must** begin with **valid** front matter, else it won't be listed
in / & /posts, and when you navigate to it, you will be met with an error page.
the error page will tell you what the problem is.
full example:
example:
```md
---
title: My first post # title of the post
description: The first post on this awesome blog! # short description of the post
author: Blubber256 # author of the post
icon: /media/first-post/icon.png # icon/thumbnail of post used in embeds
icon_alt: Picture of a computer running DOOM
color: "#00aacc" # color of post, also used in embeds
written_at: 2024-04-18T04:15:26+03:00 # date of writing, this is highly
# recommended if you are on a system which doesn't have btime (like musl),
# because this is fetched from file stats by default
#modified_at: ... # see above. this is also fetched from the filesystem
tags: # tags, or keywords, used in meta and also in the ui
- lifestyle
title: "README"
description: "the README.md file of this project"
author: "slonkazoid"
created_at: 2024-04-18T04:15:26+03:00
#modified_at: ... # see above
---
```
only the first 3 fields are required. if it can't find the other 2 fields, it will
get them from filesystem metadata. if you are on musl and you omit the
`written_at` field, it will just not show up
`created_at` field, it will just not show up
the dates must follow the [RFC 3339](https://datatracker.ietf.org/doc/html/rfc3339)
standard. examples of valid and invalid dates:
@@ -107,43 +152,25 @@ standard. examples of valid and invalid dates:
- # everything else is also invalid
```
## Non-static Routes
## Routes
- `GET /`: index page, lists posts
- `GET /posts`: returns a list of all posts with metadata in JSON format
- `GET /posts/<name>`: view a post
- `GET /posts/<name>.md`: view the raw markdown of a post
- `GET /post/*`: redirects to `/posts/*`
- `GET /feed.xml`: RSS feed
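a quick way to poke at these routes from the command line (the post name is
hypothetical, the port is the default from the config):
```sh
curl http://localhost:3000/posts                  # all posts as JSON
curl http://localhost:3000/posts/my-first-post    # rendered post
curl http://localhost:3000/posts/my-first-post.md # raw markdown
```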
## Cache
bingus-blog caches every post retrieved and keeps it permanently in cache.
there is a toggleable cleanup task that periodically sweeps the cache to
remove dead entries, but it can still get quite big.
the only way a cache entry is removed is when it's requested and it does
not exist in the filesystem. cache entries don't expire, but they get
invalidated when the mtime of the markdown file changes.
if cache persistence is on, the cache is (compressed &) written to disk on
shutdown, and read (& decompressed) on startup. one may opt to set the cache
location to point to a tmpfs to make it save and load quickly, but not persist
across reboots at the cost of more RAM usage.
if cache persistence is on, the cache is compressed & written on shutdown,
and read & decompressed on startup. one may opt to set the cache location
to point to a tmpfs so it saves and loads really fast, but it doesn't persist
across boots, also at the cost of even more RAM usage.
in my testing, the compression reduced a 3.21 MB cache to 0.18 MB almost
instantly. there is basically no good reason to not have compression on,
unless you have filesystem compression already of course.
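if you're curious what compression buys you, the persisted cache is a standard
zstd stream when `compress` is on (the file name assumes the default config):
```sh
du -h cache            # size on disk
zstd -dc cache | wc -c # decompressed size, for comparison
```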
## Contributing
make sure your changes don't break firefox, chromium, text-based browsers,
and webkit support
### Feature Requests
i want this project to be a good and usable piece of software, so i implement
feature requests provided they fit the project and its values.
most people just ping me on discord with feature requests, but if your request is
non-trivial, please create an issue [here](https://git.slonk.ing/slonk/bingus-blog/issues).
## Blagging
you've scrolled this far. you deserve to know [the truth](https://git.slonk.ing/slonk/blag).
the compression reduced a 3.21 MB cache file to 0.18 MB almost instantly.
there is basically no good reason to not have compression on.

src/app.rs

@@ -1,76 +1,49 @@
use std::sync::Arc;
use std::time::Duration;
use askama_axum::Template;
use axum::extract::{Path, Query, State};
use axum::http::header::CONTENT_TYPE;
use axum::http::Request;
use axum::response::{Html, IntoResponse, Redirect, Response};
use axum::http::{header, Request};
use axum::response::{IntoResponse, Redirect, Response};
use axum::routing::get;
use axum::{Json, Router};
use handlebars::Handlebars;
use include_dir::{include_dir, Dir};
use indexmap::IndexMap;
use rss::{Category, ChannelBuilder, ItemBuilder};
use serde::{Deserialize, Serialize};
use serde_value::Value;
use tokio::sync::RwLock;
use tower::service_fn;
use serde::Deserialize;
use tower_http::services::ServeDir;
use tower_http::trace::TraceLayer;
use tracing::{info, info_span, Span};
use crate::config::{Config, StyleConfig};
use crate::config::{Config, DateFormat};
use crate::error::{AppError, AppResult};
use crate::post::{Filter, PostManager, PostMetadata, RenderStats, ReturnedPost};
use crate::serve_dir_included::handle;
const STATIC: Dir<'static> = include_dir!("$CARGO_MANIFEST_DIR/static");
#[derive(Serialize)]
pub struct BingusInfo {
pub name: &'static str,
pub version: &'static str,
pub repository: &'static str,
}
const BINGUS_INFO: BingusInfo = BingusInfo {
name: env!("CARGO_PKG_NAME"),
version: env!("CARGO_PKG_VERSION"),
repository: env!("CARGO_PKG_REPOSITORY"),
};
use crate::filters;
use crate::post::{MarkdownPosts, PostManager, PostMetadata, RenderStats, ReturnedPost};
#[derive(Clone)]
#[non_exhaustive]
pub struct AppState {
pub config: Arc<Config>,
pub posts: Arc<dyn PostManager + Send + Sync>,
pub templates: Arc<RwLock<Handlebars<'static>>>,
pub posts: Arc<MarkdownPosts<Arc<Config>>>,
}
#[derive(Serialize)]
struct IndexTemplate<'a> {
bingus_info: &'a BingusInfo,
title: &'a str,
description: &'a str,
#[derive(Template)]
#[template(path = "index.html")]
struct IndexTemplate {
title: String,
description: String,
posts: Vec<PostMetadata>,
rss: bool,
df: DateFormat,
js: bool,
tags: IndexMap<Arc<str>, u64>,
joined_tags: String,
style: &'a StyleConfig,
}
#[derive(Serialize)]
struct PostTemplate<'a> {
bingus_info: &'a BingusInfo,
meta: &'a PostMetadata,
rendered: Arc<str>,
#[derive(Template)]
#[template(path = "post.html")]
struct PostTemplate {
meta: PostMetadata,
rendered: String,
rendered_in: RenderStats,
markdown_access: bool,
df: DateFormat,
js: bool,
color: Option<&'a str>,
joined_tags: String,
style: &'a StyleConfig,
raw_name: Option<String>,
}
#[derive(Deserialize)]
@@ -78,79 +51,23 @@ struct QueryParams {
tag: Option<String>,
#[serde(rename = "n")]
num_posts: Option<usize>,
#[serde(flatten)]
other: IndexMap<String, Value>,
}
fn collect_tags(posts: &Vec<PostMetadata>) -> IndexMap<Arc<str>, u64> {
let mut tags = IndexMap::new();
for post in posts {
for tag in &post.tags {
if let Some((existing_tag, count)) = tags.swap_remove_entry(tag) {
tags.insert(existing_tag, count + 1);
} else {
tags.insert(tag.clone(), 1);
}
}
}
tags.sort_unstable_by(|k1, _v1, k2, _v2| k1.cmp(k2));
tags.sort_by(|_k1, v1, _k2, v2| v2.cmp(v1));
tags
}
fn join_tags_for_meta(tags: &IndexMap<Arc<str>, u64>, delim: &str) -> String {
let mut s = String::new();
let tags = tags.keys().enumerate();
let len = tags.len();
for (i, tag) in tags {
s += tag;
if i != len - 1 {
s += delim;
}
}
s
}
async fn index(
State(AppState {
config,
posts,
templates: reg,
..
}): State<AppState>,
State(AppState { config, posts }): State<AppState>,
Query(query): Query<QueryParams>,
) -> AppResult<impl IntoResponse> {
) -> AppResult<IndexTemplate> {
let posts = posts
.get_max_n_post_metadata_with_optional_tag_sorted(
query.num_posts,
query.tag.as_deref(),
&query.other,
)
.get_max_n_post_metadata_with_optional_tag_sorted(query.num_posts, query.tag.as_ref())
.await?;
let tags = collect_tags(&posts);
let joined_tags = join_tags_for_meta(&tags, ", ");
let reg = reg.read().await;
let rendered = reg.render(
"index",
&IndexTemplate {
title: &config.title,
description: &config.description,
bingus_info: &BINGUS_INFO,
posts,
rss: config.rss.enable,
js: config.js_enable,
tags,
joined_tags,
style: &config.style,
},
);
drop(reg);
Ok(Html(rendered?))
Ok(IndexTemplate {
title: config.title.clone(),
description: config.description.clone(),
posts,
df: config.date_format.clone(),
js: config.js_enable,
})
}
async fn all_posts(
@@ -158,18 +75,14 @@ async fn all_posts(
Query(query): Query<QueryParams>,
) -> AppResult<Json<Vec<PostMetadata>>> {
let posts = posts
.get_max_n_post_metadata_with_optional_tag_sorted(
query.num_posts,
query.tag.as_deref(),
&query.other,
)
.get_max_n_post_metadata_with_optional_tag_sorted(query.num_posts, query.tag.as_ref())
.await?;
Ok(Json(posts))
}
async fn rss(
State(AppState { config, posts, .. }): State<AppState>,
State(AppState { config, posts }): State<AppState>,
Query(query): Query<QueryParams>,
) -> AppResult<Response> {
if !config.rss.enable {
@@ -177,14 +90,12 @@
}
let posts = posts
.get_all_posts(
query
.get_all_posts(|metadata, _| {
!query
.tag
.as_ref()
.and(Some(Filter::Tags(query.tag.as_deref().as_slice())))
.as_slice(),
&query.other,
)
.is_some_and(|tag| !metadata.tags.contains(tag))
})
.await?;
let mut channel = ChannelBuilder::default();
@@ -197,21 +108,21 @@
for (metadata, content, _) in posts {
channel.item(
ItemBuilder::default()
.title(metadata.title.to_string())
.description(metadata.description.to_string())
.author(metadata.author.to_string())
.title(metadata.title)
.description(metadata.description)
.author(metadata.author)
.categories(
metadata
.tags
.into_iter()
.map(|tag| Category {
name: tag.to_string(),
name: tag,
domain: None,
})
.collect::<Vec<Category>>(),
)
.pub_date(metadata.written_at.map(|date| date.to_rfc2822()))
.content(content.to_string())
.pub_date(metadata.created_at.map(|date| date.to_rfc2822()))
.content(content)
.link(
config
.rss
@@ -226,55 +137,29 @@
let body = channel.build().to_string();
drop(channel);
Ok(([(CONTENT_TYPE, "text/xml")], body).into_response())
Ok(([(header::CONTENT_TYPE, "text/xml")], body).into_response())
}
async fn post(
State(AppState {
config,
posts,
templates: reg,
..
}): State<AppState>,
Path(name): Path<Arc<str>>,
Query(query): Query<QueryParams>,
) -> AppResult<impl IntoResponse> {
match posts.get_post(name.clone(), &query.other).await? {
ReturnedPost::Rendered {
ref meta,
body: rendered,
perf: rendered_in,
} => {
let joined_tags = meta.tags.join(", ");
State(AppState { config, posts }): State<AppState>,
Path(name): Path<String>,
) -> AppResult<Response> {
match posts.get_post(&name).await? {
ReturnedPost::Rendered(meta, rendered, rendered_in) => {
let page = PostTemplate {
meta,
rendered,
rendered_in,
markdown_access: config.markdown_access,
df: config.date_format.clone(),
js: config.js_enable,
};
let reg = reg.read().await;
let rendered = reg.render(
"post",
&PostTemplate {
bingus_info: &BINGUS_INFO,
meta,
rendered,
rendered_in,
js: config.js_enable,
color: meta
.color
.as_deref()
.or(config.style.default_color.as_deref()),
joined_tags,
style: &config.style,
raw_name: config
.markdown_access
.then(|| posts.as_raw(&meta.name))
.unwrap_or(None),
},
);
drop(reg);
Ok(Html(rendered?).into_response())
Ok(page.into_response())
}
ReturnedPost::Raw(body, content_type) => {
Ok(([(CONTENT_TYPE, content_type)], body).into_response())
}
ReturnedPost::Raw {
buffer,
content_type,
} => Ok(([(CONTENT_TYPE, content_type)], buffer).into_response()),
}
}
@@ -292,9 +177,7 @@ pub fn new(config: &Config) -> Router<AppState> {
.route("/feed.xml", get(rss))
.nest_service(
"/static",
ServeDir::new(&config.dirs.custom_static)
.precompressed_gzip()
.fallback(service_fn(|req| handle(req, &STATIC))),
ServeDir::new(&config.dirs._static).precompressed_gzip(),
)
.nest_service("/media", ServeDir::new(&config.dirs.media))
.layer(

src/config.rs

@@ -1,15 +1,14 @@
use std::env;
use std::net::{IpAddr, Ipv6Addr};
use std::num::NonZeroU64;
use std::path::PathBuf;
use color_eyre::eyre::{self, bail, Context};
use color_eyre::eyre::{bail, Context, Result};
use serde::{Deserialize, Serialize};
use tokio::io::{AsyncReadExt, AsyncWriteExt};
use tracing::{error, info, instrument};
use url::Url;
use crate::de::*;
use crate::ranged_i128_visitor::RangedI128Visitor;
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, Eq, Hash)]
#[serde(default)]
@@ -23,20 +22,14 @@ pub struct SyntectConfig {
#[serde(default)]
pub struct RenderConfig {
pub syntect: SyntectConfig,
pub escape: bool,
#[serde(rename = "unsafe")]
pub unsafe_: bool,
}
#[derive(Serialize, Deserialize, Debug, Clone)]
#[serde(default)]
pub struct CacheConfig {
pub enable: bool,
#[serde(deserialize_with = "check_millis")]
pub ttl: Option<NonZeroU64>,
pub cleanup: bool,
#[serde(deserialize_with = "check_millis")]
pub cleanup_interval: Option<NonZeroU64>,
pub cleanup_interval: Option<u64>,
pub persistence: bool,
pub file: PathBuf,
pub compress: bool,
@@ -56,8 +49,8 @@ pub struct HttpConfig {
pub struct DirsConfig {
pub posts: PathBuf,
pub media: PathBuf,
pub custom_static: PathBuf,
pub custom_templates: PathBuf,
#[serde(rename = "static")]
pub _static: PathBuf,
}
#[derive(Serialize, Deserialize, Debug, Clone)]
@@ -74,61 +67,19 @@ pub enum DateFormat {
Strftime(String),
}
#[derive(Serialize, Deserialize, Debug, Clone, Default, Copy, PartialEq, Eq)]
#[serde(rename_all = "lowercase")]
#[repr(u8)]
pub enum Sort {
#[default]
Date,
Name,
}
#[derive(Serialize, Deserialize, Debug, Clone)]
#[serde(default)]
#[derive(Default)]
pub struct StyleConfig {
pub display_dates: DisplayDates,
pub date_format: DateFormat,
pub default_sort: Sort,
pub default_color: Option<String>,
}
#[derive(Serialize, Deserialize, Debug, Clone)]
#[serde(default)]
pub struct DisplayDates {
pub creation: bool,
pub modification: bool,
}
#[derive(Serialize, Deserialize, Default, Debug, Clone)]
#[serde(rename_all = "lowercase")]
pub enum Engine {
#[default]
Markdown,
Blag,
}
#[derive(Serialize, Deserialize, Debug, Clone)]
#[serde(default)]
pub struct BlagConfig {
pub bin: PathBuf,
}
#[derive(Serialize, Deserialize, Debug, Clone)]
#[serde(default)]
pub struct Config {
pub title: String,
pub description: String,
pub markdown_access: bool,
pub date_format: DateFormat,
pub js_enable: bool,
pub engine: Engine,
pub style: StyleConfig,
pub rss: RssConfig,
pub dirs: DirsConfig,
pub http: HttpConfig,
pub render: RenderConfig,
pub cache: CacheConfig,
pub blag: BlagConfig,
}
impl Default for Config {
@@ -137,9 +88,8 @@ impl Default for Config {
title: "bingus-blog".into(),
description: "blazingly fast markdown blog software written in rust memory safe".into(),
markdown_access: true,
date_format: Default::default(),
js_enable: true,
engine: Default::default(),
style: Default::default(),
// i have a love-hate relationship with serde
// it was enigmatic at first, but then i started actually using it
// writing my own serialize and deserialize implementations.. spending
@@ -154,16 +104,6 @@ impl Default for Config {
http: Default::default(),
render: Default::default(),
cache: Default::default(),
blag: Default::default(),
}
}
}
impl Default for DisplayDates {
fn default() -> Self {
Self {
creation: true,
modification: true,
}
}
}
@@ -173,8 +113,7 @@ impl Default for DirsConfig {
Self {
posts: "posts".into(),
media: "media".into(),
custom_static: "static".into(),
custom_templates: "templates".into(),
_static: "static".into(),
}
}
}
@@ -202,7 +141,6 @@ impl Default for CacheConfig {
fn default() -> Self {
Self {
enable: true,
ttl: None,
cleanup: true,
cleanup_interval: None,
persistence: true,
@@ -213,19 +151,13 @@ impl Default for CacheConfig {
}
}
impl Default for BlagConfig {
fn default() -> Self {
Self { bin: "blag".into() }
}
}
#[instrument(name = "config")]
pub async fn load() -> eyre::Result<Config> {
pub async fn load() -> Result<Config> {
let config_file = env::var(format!(
"{}_CONFIG",
env!("CARGO_BIN_NAME").to_uppercase().replace('-', "_")
env!("CARGO_BIN_NAME").to_uppercase().replace("-", "_")
))
.unwrap_or_else(|_| String::from("config.toml"));
.unwrap_or(String::from("config.toml"));
match tokio::fs::OpenOptions::new()
.read(true)
.open(&config_file)
@@ -272,13 +204,6 @@ fn check_zstd_level_bounds<'de, D>(d: D) -> Result<i32, D::Error>
where
D: serde::Deserializer<'de>,
{
d.deserialize_i32(RangedI64Visitor::<1, 22>)
d.deserialize_i32(RangedI128Visitor::<1, 22>)
.map(|x| x as i32)
}
fn check_millis<'de, D>(d: D) -> Result<Option<NonZeroU64>, D::Error>
where
D: serde::Deserializer<'de>,
{
d.deserialize_option(MillisVisitor)
}

src/de.rs

@@ -1,86 +0,0 @@
use std::num::NonZeroU64;
use serde::de::Error;
use serde::{
de::{Unexpected, Visitor},
Deserializer,
};
pub struct RangedI64Visitor<const START: i64, const END: i64>;
impl<const START: i64, const END: i64> serde::de::Visitor<'_> for RangedI64Visitor<START, END> {
type Value = i64;
fn expecting(&self, formatter: &mut std::fmt::Formatter) -> std::fmt::Result {
write!(formatter, "an integer between {START} and {END}")
}
fn visit_i32<E>(self, v: i32) -> Result<Self::Value, E>
where
E: serde::de::Error,
{
self.visit_i64(v as i64)
}
fn visit_i64<E>(self, v: i64) -> Result<Self::Value, E>
where
E: serde::de::Error,
{
if v >= START && v <= END {
Ok(v)
} else {
Err(E::custom(format!(
"integer is out of bounds ({START}..{END})"
)))
}
}
fn visit_i128<E>(self, v: i128) -> Result<Self::Value, E>
where
E: serde::de::Error,
{
self.visit_i64(v as i64)
}
}
pub struct U64Visitor;
impl Visitor<'_> for U64Visitor {
type Value = u64;
fn expecting(&self, formatter: &mut std::fmt::Formatter) -> std::fmt::Result {
write!(formatter, "a non-negative integer")
}
fn visit_u64<E>(self, v: u64) -> Result<Self::Value, E>
where
E: serde::de::Error,
{
Ok(v)
}
fn visit_i64<E>(self, v: i64) -> Result<Self::Value, E>
where
E: serde::de::Error,
{
u64::try_from(v).map_err(|_| E::invalid_value(Unexpected::Signed(v), &self))
}
}
pub struct MillisVisitor;
impl<'de> Visitor<'de> for MillisVisitor {
type Value = Option<NonZeroU64>;
fn expecting(&self, formatter: &mut std::fmt::Formatter) -> std::fmt::Result {
write!(formatter, "a positive integer")
}
fn visit_some<D>(self, deserializer: D) -> Result<Self::Value, D::Error>
where
D: Deserializer<'de>,
{
let n = deserializer.deserialize_i64(U64Visitor)?;
NonZeroU64::new(n)
.ok_or(D::Error::invalid_value(Unexpected::Unsigned(n), &self))
.map(Some)
}
}

src/error.rs

@@ -1,47 +1,43 @@
use std::sync::Arc;
use std::fmt::Display;
use askama_axum::Template;
use axum::http::StatusCode;
use axum::response::{IntoResponse, Response};
use color_eyre::eyre;
use thiserror::Error;
use tracing::error;
#[derive(Debug)]
#[repr(transparent)]
pub struct FronmaError(fronma::error::Error);
impl std::error::Error for FronmaError {}
impl Display for FronmaError {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
f.write_str("failed to parse front matter: ")?;
match &self.0 {
fronma::error::Error::MissingBeginningLine => f.write_str("missing beginning line"),
fronma::error::Error::MissingEndingLine => f.write_str("missing ending line"),
fronma::error::Error::SerdeYaml(yaml_error) => write!(f, "{}", yaml_error),
}
}
}
#[derive(Error, Debug)]
#[allow(clippy::enum_variant_names)]
pub enum PostError {
#[error("io error: {0}")]
#[error(transparent)]
IoError(#[from] std::io::Error),
#[error("failed to parse post metadata: {0}")]
ParseError(String),
#[error("failed to render post: {0}")]
RenderError(String),
#[error(transparent)]
AskamaError(#[from] askama::Error),
#[error(transparent)]
ParseError(#[from] FronmaError),
#[error("post {0:?} not found")]
NotFound(Arc<str>),
#[error("unexpected: {0}")]
Other(#[from] eyre::Error),
NotFound(String),
}
impl From<fronma::error::Error> for PostError {
fn from(value: fronma::error::Error) -> Self {
let binding;
Self::ParseError(format!(
"failed to parse front matter: {}",
match value {
fronma::error::Error::MissingBeginningLine => "missing beginning line",
fronma::error::Error::MissingEndingLine => "missing ending line",
fronma::error::Error::SerdeYaml(yaml_error) => {
binding = yaml_error.to_string();
&binding
}
}
))
}
}
impl From<serde_json::Error> for PostError {
fn from(value: serde_json::Error) -> Self {
Self::ParseError(value.to_string())
Self::ParseError(FronmaError(value))
}
}
@@ -57,8 +53,6 @@ pub type AppResult<T> = Result<T, AppError>;
pub enum AppError {
#[error("failed to fetch post: {0}")]
PostError(#[from] PostError),
#[error(transparent)]
HandlebarsError(#[from] handlebars::RenderError),
#[error("rss is disabled")]
RssDisabled,
#[error(transparent)]
@@ -80,14 +74,20 @@ struct ErrorTemplate {
impl IntoResponse for AppError {
fn into_response(self) -> Response {
let error = self.to_string();
error!("error while handling request: {error}");
let status_code = match &self {
AppError::PostError(PostError::NotFound(_)) => StatusCode::NOT_FOUND,
AppError::PostError(err) => match err {
PostError::NotFound(_) => StatusCode::NOT_FOUND,
_ => StatusCode::INTERNAL_SERVER_ERROR,
},
AppError::RssDisabled => StatusCode::FORBIDDEN,
_ => StatusCode::INTERNAL_SERVER_ERROR,
AppError::UrlError(_) => StatusCode::INTERNAL_SERVER_ERROR,
};
(status_code, ErrorTemplate { error }).into_response()
(
status_code,
ErrorTemplate {
error: self.to_string(),
},
)
.into_response()
}
}

src/filters.rs Normal file (52 additions)

@@ -0,0 +1,52 @@
use std::collections::HashMap;
use std::fmt::Display;
use std::time::Duration;
use chrono::{DateTime, TimeZone};
use crate::config::DateFormat;
use crate::post::PostMetadata;
fn format_date<T>(date: &DateTime<T>, date_format: &DateFormat) -> String
where
T: TimeZone,
T::Offset: Display,
{
match date_format {
DateFormat::RFC3339 => date.to_rfc3339_opts(chrono::SecondsFormat::Secs, true),
DateFormat::Strftime(ref format_string) => date.format(format_string).to_string(),
}
}
pub fn date<T>(date: &DateTime<T>, date_format: &DateFormat) -> Result<String, askama::Error>
where
T: TimeZone,
T::Offset: Display,
{
Ok(format_date(date, date_format))
}
pub fn duration(duration: &&Duration) -> Result<String, askama::Error> {
Ok(format!("{:?}", duration))
}
pub fn collect_tags(posts: &Vec<PostMetadata>) -> Result<Vec<(String, u64)>, askama::Error> {
let mut tags = HashMap::new();
for post in posts {
for tag in &post.tags {
if let Some((existing_tag, count)) = tags.remove_entry(tag) {
tags.insert(existing_tag, count + 1);
} else {
tags.insert(tag.clone(), 1);
}
}
}
let mut tags: Vec<(String, u64)> = tags.into_iter().collect();
tags.sort_unstable_by_key(|(v, _)| v.clone());
tags.sort_by_key(|(_, v)| -(*v as i64));
Ok(tags)
}

src/hash_arc_store.rs Normal file (37 additions)

@@ -0,0 +1,37 @@
use std::hash::{DefaultHasher, Hash, Hasher};
use std::marker::PhantomData;
use std::sync::Arc;
pub struct HashArcStore<T, Lookup>
where
Lookup: Hash,
{
inner: Option<Arc<T>>,
hash: Option<u64>,
_phantom: PhantomData<Lookup>,
}
impl<T, Lookup> HashArcStore<T, Lookup>
where
Lookup: Hash,
{
pub fn new() -> Self {
Self {
inner: None,
hash: None,
_phantom: PhantomData,
}
}
pub fn get_or_init(&mut self, key: &Lookup, init: impl Fn(&Lookup) -> Arc<T>) -> Arc<T> {
let mut h = DefaultHasher::new();
key.hash(&mut h);
let hash = h.finish();
if !self.hash.is_some_and(|inner_hash| inner_hash == hash) {
self.inner = Some(init(key));
self.hash = Some(hash);
}
// safety: please. (`self.inner` was just set to `Some` above if it wasn't already)
unsafe { self.inner.as_ref().unwrap_unchecked().clone() }
}
}

src/helpers.rs

@@ -1,24 +0,0 @@
use std::fmt::Display;
use std::time::Duration;
use chrono::{DateTime, TimeZone, Utc};
use handlebars::handlebars_helper;
use crate::config::DateFormat;
fn date_impl<T>(date_time: &DateTime<T>, date_format: &DateFormat) -> String
where
T: TimeZone,
T::Offset: Display,
{
match date_format {
DateFormat::RFC3339 => date_time.to_rfc3339_opts(chrono::SecondsFormat::Secs, true),
DateFormat::Strftime(ref format_string) => date_time.format(format_string).to_string(),
}
}
handlebars_helper!(date: |date_time: Option<DateTime<Utc>>, date_format: DateFormat| {
date_impl(date_time.as_ref().unwrap(), &date_format)
});
handlebars_helper!(duration: |duration_: Duration| format!("{:?}", duration_));

src/main.rs

@@ -1,16 +1,14 @@
#![feature(let_chains, pattern, path_add_extension)]
#![feature(let_chains)]
mod app;
mod config;
mod de;
mod error;
mod helpers;
mod filters;
mod hash_arc_store;
mod markdown_render;
mod platform;
mod post;
mod serve_dir_included;
mod ranged_i128_visitor;
mod systemtime_as_secs;
mod templates;
use std::future::IntoFuture;
use std::net::SocketAddr;
@@ -19,37 +17,32 @@ use std::sync::Arc;
use std::time::Duration;
use color_eyre::eyre::{self, Context};
use config::Engine;
use tokio::net::TcpListener;
use tokio::sync::RwLock;
use tokio::task::JoinSet;
use tokio::time::Instant;
use tokio::{select, signal};
use tokio_util::sync::CancellationToken;
use tracing::level_filters::LevelFilter;
use tracing::{debug, error, info, info_span, warn, Instrument};
use tracing::{debug, info, warn};
use tracing_subscriber::layer::SubscriberExt;
use tracing_subscriber::{util::SubscriberInitExt, EnvFilter};
use crate::app::AppState;
use crate::post::cache::{load_cache, Cache, CacheGuard, CACHE_VERSION};
use crate::post::{Blag, MarkdownPosts, PostManager};
use crate::templates::new_registry;
use crate::templates::watcher::watch_templates;
use crate::post::{MarkdownPosts, PostManager};
#[tokio::main]
async fn main() -> eyre::Result<()> {
color_eyre::install()?;
let reg = tracing_subscriber::registry();
#[cfg(feature = "tokio-console")]
let reg = reg.with(console_subscriber::spawn());
console_subscriber::init();
color_eyre::install()?;
#[cfg(not(feature = "tokio-console"))]
let reg = reg.with(
EnvFilter::builder()
.with_default_directive(LevelFilter::INFO.into())
.from_env_lossy(),
);
reg.with(tracing_subscriber::fmt::layer()).init();
tracing_subscriber::registry()
.with(
EnvFilter::builder()
.with_default_directive(LevelFilter::INFO.into())
.from_env_lossy(),
)
.with(tracing_subscriber::fmt::layer())
.init();
let config = Arc::new(
config::load()
@@ -62,70 +55,22 @@ async fn main() -> eyre::Result<()> {
let mut tasks = JoinSet::new();
let cancellation_token = CancellationToken::new();
let start = Instant::now();
// NOTE: use tokio::task::spawn_blocking if this ever turns into a concurrent task
let mut reg = new_registry(&config.dirs.custom_templates)
.context("failed to create handlebars registry")?;
reg.register_helper("date", Box::new(helpers::date));
reg.register_helper("duration", Box::new(helpers::duration));
debug!(duration = ?start.elapsed(), "registered all templates");
let registry = Arc::new(RwLock::new(reg));
debug!("setting up watcher");
let watcher_token = cancellation_token.child_token();
tasks.spawn(
watch_templates(
config.dirs.custom_templates.clone(),
watcher_token.clone(),
registry.clone(),
)
.instrument(info_span!("custom_template_watcher")),
);
let cache = if config.cache.enable {
if config.cache.persistence && tokio::fs::try_exists(&config.cache.file).await? {
info!("loading cache from file");
let mut cache = load_cache(&config.cache).await.unwrap_or_else(|err| {
error!("failed to load cache: {}", err);
info!("using empty cache");
Cache::new(config.cache.ttl)
});
if cache.version() < CACHE_VERSION {
warn!("cache version changed, clearing cache");
cache = Cache::new(config.cache.ttl);
};
Some(cache)
} else {
Some(Cache::new(config.cache.ttl))
}
} else {
None
}
.map(|cache| CacheGuard::new(cache, config.cache.clone()))
.map(Arc::new);
let posts: Arc<dyn PostManager + Send + Sync> = match config.engine {
Engine::Markdown => Arc::new(MarkdownPosts::new(Arc::clone(&config), cache.clone()).await?),
Engine::Blag => Arc::new(Blag::new(
config.dirs.posts.clone().into(),
config.blag.bin.clone().into(),
cache.clone(),
)),
let posts = Arc::new(MarkdownPosts::new(Arc::clone(&config)).await?);
let state = AppState {
config: Arc::clone(&config),
posts: Arc::clone(&posts),
};
if config.cache.enable && config.cache.cleanup {
if let Some(millis) = config.cache.cleanup_interval {
if let Some(t) = config.cache.cleanup_interval {
let posts = Arc::clone(&posts);
let token = cancellation_token.child_token();
debug!("setting up cleanup task");
tasks.spawn(async move {
let mut interval = tokio::time::interval(Duration::from_millis(millis.into()));
let mut interval = tokio::time::interval(Duration::from_millis(t));
loop {
select! {
_ = token.cancelled() => break Ok(()),
_ = token.cancelled() => break,
_ = interval.tick() => {
posts.cleanup().await
}
@@ -137,11 +82,6 @@ async fn main() -> eyre::Result<()> {
}
}
let state = AppState {
config: Arc::clone(&config),
posts,
templates: registry,
};
let app = app::new(&config).with_state(state.clone());
let listener = TcpListener::bind(socket_addr)
@@ -153,7 +93,13 @@ async fn main() -> eyre::Result<()> {
info!("listening on http://{}", local_addr);
let sigint = signal::ctrl_c();
let sigterm = platform::sigterm();
#[cfg(unix)]
let mut sigterm_handler =
tokio::signal::unix::signal(tokio::signal::unix::SignalKind::terminate())?;
#[cfg(unix)]
let sigterm = sigterm_handler.recv();
#[cfg(not(unix))] // TODO: kill all windows server users
let sigterm = std::future::pending::<()>();
let axum_token = cancellation_token.child_token();
@@ -181,10 +127,7 @@
cancellation_token.cancel();
server.await.context("failed to serve app")?;
while let Some(task) = tasks.join_next().await {
let res = task.context("failed to join task")?;
if let Err(err) = res {
error!("task failed with error: {err}");
}
task.context("failed to join task")?;
}
drop(state);
@@ -192,7 +135,13 @@
};
let sigint = signal::ctrl_c();
let sigterm = platform::sigterm();
#[cfg(unix)]
let mut sigterm_handler =
tokio::signal::unix::signal(tokio::signal::unix::SignalKind::terminate())?;
#[cfg(unix)]
let sigterm = sigterm_handler.recv();
#[cfg(not(unix))]
let sigterm = std::future::pending::<()>();
tokio::select! {
result = cleanup => {

src/markdown_render.rs

@@ -1,5 +1,5 @@
use color_eyre::eyre::{self, Context};
use comrak::adapters::SyntaxHighlighterAdapter;
use std::sync::{Arc, OnceLock, RwLock};
use comrak::markdown_to_html_with_plugins;
use comrak::plugins::syntect::{SyntectAdapter, SyntectAdapterBuilder};
use comrak::ComrakOptions;
@@ -7,30 +7,32 @@ use comrak::RenderPlugins;
use syntect::highlighting::ThemeSet;
use crate::config::RenderConfig;
use crate::hash_arc_store::HashArcStore;
pub fn build_syntect(config: &RenderConfig) -> eyre::Result<SyntectAdapter> {
fn syntect_adapter(config: &RenderConfig) -> Arc<SyntectAdapter> {
static STATE: OnceLock<RwLock<HashArcStore<SyntectAdapter, RenderConfig>>> = OnceLock::new();
let lock = STATE.get_or_init(|| RwLock::new(HashArcStore::new()));
let mut guard = lock.write().unwrap();
guard.get_or_init(config, build_syntect)
}
fn build_syntect(config: &RenderConfig) -> Arc<SyntectAdapter> {
let mut theme_set = if config.syntect.load_defaults {
ThemeSet::load_defaults()
} else {
ThemeSet::new()
};
if let Some(path) = config.syntect.themes_dir.as_ref() {
theme_set
.add_from_folder(path)
.with_context(|| format!("failed to add themes from {path:?}"))?;
theme_set.add_from_folder(path).unwrap();
}
let mut builder = SyntectAdapterBuilder::new().theme_set(theme_set);
if let Some(theme) = config.syntect.theme.as_ref() {
builder = builder.theme(theme);
}
Ok(builder.build())
Arc::new(builder.build())
}
pub fn render(
markdown: &str,
config: &RenderConfig,
syntect: Option<&dyn SyntaxHighlighterAdapter>,
) -> String {
pub fn render(markdown: &str, config: &RenderConfig) -> String {
let mut options = ComrakOptions::default();
options.extension.table = true;
options.extension.autolink = true;
@@ -39,13 +41,15 @@ pub fn render(
options.extension.strikethrough = true;
options.extension.multiline_block_quotes = true;
options.extension.header_ids = Some(String::new());
options.render.escape = config.escape;
options.render.unsafe_ = config.unsafe_;
let mut render_plugins = RenderPlugins::default();
render_plugins.codefence_syntax_highlighter = syntect;
let syntect = syntect_adapter(config);
render_plugins.codefence_syntax_highlighter = Some(syntect.as_ref());
let plugins = comrak::Plugins::builder().render(render_plugins).build();
let plugins = comrak::PluginsBuilder::default()
.render(render_plugins)
.build()
.unwrap();
markdown_to_html_with_plugins(markdown, &options, &plugins)
}

src/platform.rs

@@ -1,9 +0,0 @@
pub async fn sigterm() -> Result<Option<()>, std::io::Error> {
#[cfg(unix)]
let mut sigterm_handler =
tokio::signal::unix::signal(tokio::signal::unix::SignalKind::terminate())?;
#[cfg(unix)]
return Ok(sigterm_handler.recv().await);
#[cfg(not(unix))]
std::future::pending::<None>().await
}

src/post/blag.rs

@@ -1,340 +0,0 @@
use std::collections::BTreeSet;
use std::hash::{DefaultHasher, Hash, Hasher};
use std::path::Path;
use std::process::Stdio;
use std::sync::Arc;
use std::time::Duration;
use axum::async_trait;
use axum::http::HeaderValue;
use chrono::{DateTime, Utc};
use futures::stream::FuturesUnordered;
use futures::{FutureExt, StreamExt};
use indexmap::IndexMap;
use serde::Deserialize;
use serde_value::Value;
use tokio::fs::OpenOptions;
use tokio::io::{AsyncBufReadExt, AsyncReadExt, BufReader};
use tokio::time::Instant;
use tracing::{debug, error, info, instrument};
use crate::error::PostError;
use crate::post::Filter;
use crate::systemtime_as_secs::as_secs;
use super::cache::{CacheGuard, CacheValue};
use super::{ApplyFilters, PostManager, PostMetadata, RenderStats, ReturnedPost};
#[derive(Deserialize, Debug)]
struct BlagMetadata {
pub title: Arc<str>,
pub description: Arc<str>,
pub author: Arc<str>,
pub icon: Option<Arc<str>>,
pub icon_alt: Option<Arc<str>>,
pub color: Option<Arc<str>>,
#[serde(alias = "created_at")]
pub written_at: Option<DateTime<Utc>>,
pub modified_at: Option<DateTime<Utc>>,
#[serde(default)]
pub tags: BTreeSet<Arc<str>>,
pub dont_cache: bool,
pub raw: Option<Arc<str>>,
}
impl BlagMetadata {
pub fn into_full(self, name: Arc<str>) -> (PostMetadata, bool, Option<Arc<str>>) {
(
PostMetadata {
name,
title: self.title,
description: self.description,
author: self.author,
icon: self.icon,
icon_alt: self.icon_alt,
color: self.color,
written_at: self.written_at,
modified_at: self.modified_at,
tags: self.tags.into_iter().collect(),
},
self.dont_cache,
self.raw,
)
}
}
pub struct Blag {
root: Arc<Path>,
blag_bin: Arc<Path>,
cache: Option<Arc<CacheGuard>>,
_fastblag: bool,
}
enum RenderResult {
Normal(PostMetadata, String, (Duration, Duration), bool),
Raw(Vec<u8>, Arc<str>),
}
impl Blag {
pub fn new(root: Arc<Path>, blag_bin: Arc<Path>, cache: Option<Arc<CacheGuard>>) -> Blag {
Self {
root,
blag_bin,
cache,
_fastblag: false,
}
}
async fn render(
&self,
name: Arc<str>,
path: impl AsRef<Path>,
query_json: String,
) -> Result<RenderResult, PostError> {
let start = Instant::now();
debug!(%name, "rendering");
let mut cmd = tokio::process::Command::new(&*self.blag_bin)
.arg(path.as_ref())
.env("BLAG_QUERY", query_json)
.stdout(Stdio::piped())
.stderr(Stdio::inherit())
.stdin(Stdio::null())
.spawn()
.map_err(|err| {
error!("failed to spawn {:?}: {err}", self.blag_bin);
err
})?;
let stdout = cmd.stdout.take().unwrap();
let mut reader = BufReader::new(stdout);
let mut buf = String::new();
reader.read_line(&mut buf).await?;
let blag_meta: BlagMetadata = serde_json::from_str(&buf)?;
debug!("blag meta: {blag_meta:?}");
let (meta, dont_cache, raw) = blag_meta.into_full(name);
buf.clear();
// this is morally reprehensible
if let Some(raw) = raw {
let mut buf = buf.into_bytes();
reader.read_to_end(&mut buf).await?;
return Ok(RenderResult::Raw(buf, raw));
}
let parsed = start.elapsed();
let rendering = Instant::now();
reader.read_to_string(&mut buf).await?;
let status = cmd.wait().await?;
debug!("exited: {status}");
if !status.success() {
return Err(PostError::RenderError(status.to_string()));
}
let rendered = rendering.elapsed();
Ok(RenderResult::Normal(
meta,
buf,
(parsed, rendered),
dont_cache,
))
}
}
#[async_trait]
impl PostManager for Blag {
async fn get_all_posts(
&self,
filters: &[Filter<'_>],
query: &IndexMap<String, Value>,
) -> Result<Vec<(PostMetadata, Arc<str>, RenderStats)>, PostError> {
let mut set = FuturesUnordered::new();
let mut posts = Vec::new();
let mut files = tokio::fs::read_dir(&self.root).await?;
loop {
let entry = match files.next_entry().await {
Ok(Some(v)) => v,
Ok(None) => break,
Err(err) => {
error!("error while getting next entry: {err}");
continue;
}
};
let stat = tokio::fs::metadata(entry.path()).await?;
if stat.is_file() {
let mut name = match entry.file_name().into_string() {
Ok(v) => v,
Err(_) => {
continue;
}
};
if self.is_raw(&name) {
name.truncate(name.len() - 3);
let name = name.into();
set.push(self.get_post(Arc::clone(&name), query).map(|v| (name, v)));
}
}
}
while let Some((name, result)) = set.next().await {
let post = match result {
Ok(v) => match v {
ReturnedPost::Rendered { meta, body, perf } => (meta, body, perf),
ReturnedPost::Raw { .. } => unreachable!(),
},
Err(err) => {
error!("error while rendering blagpost {name:?}: {err}");
continue;
}
};
if post.0.apply_filters(filters) {
posts.push(post);
}
}
debug!("collected posts");
Ok(posts)
}
#[instrument(skip(self))]
async fn get_post(
&self,
name: Arc<str>,
query: &IndexMap<String, Value>,
) -> Result<ReturnedPost, PostError> {
let start = Instant::now();
let mut path = self.root.join(&*name);
if self.is_raw(&name) {
let mut buffer = Vec::new();
let mut file =
OpenOptions::new()
.read(true)
.open(&path)
.await
.map_err(|err| match err.kind() {
std::io::ErrorKind::NotFound => PostError::NotFound(name),
_ => PostError::IoError(err),
})?;
file.read_to_end(&mut buffer).await?;
return Ok(ReturnedPost::Raw {
buffer,
content_type: HeaderValue::from_static("text/x-shellscript"),
});
} else {
path.add_extension("sh");
}
let stat = tokio::fs::metadata(&path)
.await
.map_err(|err| match err.kind() {
std::io::ErrorKind::NotFound => PostError::NotFound(name.clone()),
_ => PostError::IoError(err),
})?;
if !stat.is_file() {
return Err(PostError::NotFound(name));
}
let mtime = as_secs(stat.modified()?);
let query_json = serde_json::to_string(&query).expect("this should not fail");
let mut hasher = DefaultHasher::new();
query_json.hash(&mut hasher);
let query_hash = hasher.finish();
let post = if let Some(cache) = &self.cache
&& let Some(CacheValue { meta, body, .. }) =
cache.lookup(name.clone(), mtime, query_hash).await
{
ReturnedPost::Rendered {
meta,
body,
perf: RenderStats::Cached(start.elapsed()),
}
} else {
let (meta, content, (parsed, rendered), dont_cache) =
match self.render(name.clone(), path, query_json).await? {
RenderResult::Normal(x, y, z, w) => (x, y, z, w),
RenderResult::Raw(buffer, content_type) => {
return Ok(ReturnedPost::Raw {
buffer,
content_type: HeaderValue::from_str(&content_type)
.map_err(Into::into)
.map_err(PostError::Other)?,
});
}
};
let body = content.into();
if !dont_cache && let Some(cache) = &self.cache {
cache
.insert(name, meta.clone(), mtime, Arc::clone(&body), query_hash)
.await;
}
let total = start.elapsed();
ReturnedPost::Rendered {
meta,
body,
perf: RenderStats::Rendered {
total,
parsed,
rendered,
},
}
};
if let ReturnedPost::Rendered { perf, .. } = &post {
info!("rendered blagpost in {:?}", perf);
}
Ok(post)
}
async fn cleanup(&self) {
if let Some(cache) = &self.cache {
cache
.cleanup(|key, value| {
let mtime = std::fs::metadata(
self.root
.join(self.as_raw(&key.name).unwrap_or_else(|| unreachable!())),
)
.ok()
.and_then(|metadata| metadata.modified().ok())
.map(as_secs);
match mtime {
Some(mtime) => mtime <= value.mtime,
None => false,
}
})
.await
}
}
fn is_raw(&self, name: &str) -> bool {
name.ends_with(".sh")
}
fn as_raw(&self, name: &str) -> Option<String> {
let mut buf = String::with_capacity(name.len() + 3);
buf += name;
buf += ".sh";
Some(buf)
}
}

src/post/cache.rs

@@ -1,79 +1,53 @@
use std::fmt::Debug;
use std::io::{Read, Write};
use std::num::NonZeroU64;
use std::ops::Deref;
use std::sync::Arc;
use std::time::SystemTime;
use std::hash::{DefaultHasher, Hash, Hasher};
use std::io::Read;
use crate::config::CacheConfig;
use crate::config::{Config, RenderConfig};
use crate::post::PostMetadata;
use color_eyre::eyre::{self, Context};
use scc::HashMap;
use serde::{Deserialize, Serialize};
use tokio::io::AsyncReadExt;
use tracing::{debug, info, instrument, trace, Span};
use tracing::{debug, instrument};
/// do not persist cache if this version number changed
pub const CACHE_VERSION: u16 = 5;
pub const CACHE_VERSION: u16 = 2;
fn now() -> u128 {
crate::systemtime_as_secs::as_millis(SystemTime::now())
}
#[derive(Serialize, Deserialize, Clone, Debug)]
#[derive(Serialize, Deserialize, Clone)]
pub struct CacheValue {
pub meta: PostMetadata,
pub body: Arc<str>,
pub metadata: PostMetadata,
pub rendered: String,
pub mtime: u64,
/// when the item was inserted into cache, in milliseconds since epoch
pub cached_at: u128,
config_hash: u64,
}
#[derive(Serialize, Deserialize, Clone)]
pub struct Cache {
map: HashMap<CacheKey, CacheValue>,
version: u16,
#[serde(skip)]
ttl: Option<NonZeroU64>,
}
pub struct Cache(HashMap<String, CacheValue>, u16);
#[derive(Serialize, Deserialize, Hash, Eq, PartialEq, Clone, Debug)]
#[repr(C)]
pub struct CacheKey {
pub name: Arc<str>,
pub extra: u64,
impl Default for Cache {
fn default() -> Self {
Self(Default::default(), CACHE_VERSION)
}
}
impl Cache {
pub fn new(ttl: Option<NonZeroU64>) -> Self {
Cache {
map: Default::default(),
version: CACHE_VERSION,
ttl,
}
}
fn up_to_date(&self, cached: &CacheValue, mtime: u64) -> bool {
mtime <= cached.mtime
&& self
.ttl
.is_none_or(|ttl| cached.cached_at + u64::from(ttl) as u128 >= now())
}
#[instrument(level = "debug", skip(self), fields(entry_mtime))]
pub async fn lookup(&self, name: Arc<str>, mtime: u64, extra: u64) -> Option<CacheValue> {
trace!("looking up in cache");
match self.map.get_async(&CacheKey { name, extra }).await {
pub async fn lookup(
&self,
name: &str,
mtime: u64,
config: &RenderConfig,
) -> Option<CacheValue> {
match self.0.get_async(name).await {
Some(entry) => {
let cached = entry.get();
Span::current().record("entry_mtime", cached.mtime);
trace!("found in cache");
if self.up_to_date(cached, mtime) {
trace!("entry up-to-date");
if mtime <= cached.mtime && {
let mut hasher = DefaultHasher::new();
config.hash(&mut hasher);
hasher.finish()
} == cached.config_hash
{
Some(cached.clone())
} else {
let _ = entry.remove();
debug!("removed stale entry");
None
}
}
@@ -81,24 +55,14 @@ impl Cache {
}
}
#[instrument(level = "debug", skip(self), fields(entry_mtime))]
pub async fn lookup_metadata(
&self,
name: Arc<str>,
mtime: u64,
extra: u64,
) -> Option<PostMetadata> {
trace!("looking up metadata in cache");
match self.map.get_async(&CacheKey { name, extra }).await {
pub async fn lookup_metadata(&self, name: &str, mtime: u64) -> Option<PostMetadata> {
match self.0.get_async(name).await {
Some(entry) => {
let cached = entry.get();
Span::current().record("entry_mtime", cached.mtime);
if self.up_to_date(cached, mtime) {
trace!("entry up-to-date");
Some(cached.meta.clone())
if mtime <= cached.mtime {
Some(cached.metadata.clone())
} else {
let _ = entry.remove();
debug!("removed stale entry");
None
}
}
@@ -106,162 +70,78 @@ impl Cache {
}
}
#[instrument(level = "debug", skip(self))]
pub async fn insert(
&self,
name: Arc<str>,
name: String,
metadata: PostMetadata,
mtime: u64,
rendered: Arc<str>,
extra: u64,
) -> Option<CacheValue> {
trace!("inserting into cache");
rendered: String,
config: &RenderConfig,
) -> Result<(), (String, (PostMetadata, String))> {
let mut hasher = DefaultHasher::new();
config.hash(&mut hasher);
let hash = hasher.finish();
let r = self
.map
.upsert_async(
CacheKey { name, extra },
CacheValue {
meta: metadata,
body: rendered,
mtime,
cached_at: now(),
},
)
.await;
let value = CacheValue {
metadata,
rendered,
mtime,
config_hash: hash,
};
debug!(
"{} cache",
match r {
Some(_) => "updated in",
None => "inserted into",
}
);
r
if self
.0
.update_async(&name, |_, _| value.clone())
.await
.is_none()
{
self.0
.insert_async(name, value)
.await
.map_err(|x| (x.0, (x.1.metadata, x.1.rendered)))
} else {
Ok(())
}
}
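
the `extra`/`config_hash` component exists so a cache entry is only valid for the exact render options that produced it. a small sketch of that hashing step with std's `DefaultHasher`; the `RenderConfig` fields here are hypothetical:

```rust
use std::hash::{DefaultHasher, Hash, Hasher};

// hypothetical stand-in for the real render options
#[derive(Hash)]
struct RenderConfig {
    syntax_theme: String,
    unsafe_html: bool,
}

fn config_hash(config: &RenderConfig) -> u64 {
    let mut hasher = DefaultHasher::new();
    config.hash(&mut hasher);
    hasher.finish()
}

fn main() {
    let a = RenderConfig { syntax_theme: "mocha".into(), unsafe_html: false };
    let mut b = RenderConfig { syntax_theme: "mocha".into(), unsafe_html: false };
    // identical options hash identically, so cached entries stay valid
    assert_eq!(config_hash(&a), config_hash(&b));
    // changing an option changes the hash, invalidating old entries
    b.unsafe_html = true;
    assert_ne!(config_hash(&a), config_hash(&b));
}
```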
#[instrument(level = "debug", skip(self))]
#[allow(unused)]
pub async fn remove(&self, name: Arc<str>, extra: u64) -> Option<(CacheKey, CacheValue)> {
trace!("removing from cache");
let r = self.map.remove_async(&CacheKey { name, extra }).await;
debug!(
"item {} cache",
match r {
Some(_) => "removed from",
None => "did not exist in",
}
);
r
pub async fn remove(&self, name: &str) -> Option<(String, CacheValue)> {
self.0.remove_async(name).await
}
pub async fn retain(&self, predicate: impl Fn(&CacheKey, &CacheValue) -> bool) {
let old_size = self.map.len();
#[instrument(name = "cleanup", skip_all)]
pub async fn cleanup(&self, get_mtime: impl Fn(&str) -> Option<u64>) {
let old_size = self.0.len();
let mut i = 0;
// TODO: multithread
// not urgent as this is run concurrently anyway
self.map
self.0
.retain_async(|k, v| {
if predicate(k, v) {
if get_mtime(k).is_some_and(|mtime| mtime == v.mtime) {
true
} else {
debug!("removing {k:?} from cache");
debug!("removing {k} from cache");
i += 1;
false
}
})
.await;
let new_size = self.len();
let new_size = self.0.len();
debug!("removed {i} entries ({old_size} -> {new_size} entries)");
}
#[instrument(level = "debug", skip_all)]
pub async fn cleanup(&self, predicate: impl Fn(&CacheKey, &CacheValue) -> bool) {
self.retain(|k, v| {
self.ttl
.is_none_or(|ttl| v.cached_at + u64::from(ttl) as u128 >= now())
&& predicate(k, v)
})
.await
}
pub fn len(&self) -> usize {
self.map.len()
}
#[inline(always)]
pub fn version(&self) -> u16 {
self.version
self.1
}
}
pub struct CacheGuard {
inner: Cache,
config: CacheConfig,
}
impl CacheGuard {
pub fn new(cache: Cache, config: CacheConfig) -> Self {
Self {
inner: cache,
config,
}
}
fn try_drop(&mut self) -> Result<(), eyre::Report> {
// write cache to file
let path = &self.config.file;
let serialized = bitcode::serialize(&self.inner).context("failed to serialize cache")?;
let mut cache_file = std::fs::File::create(path)
.with_context(|| format!("failed to open cache at {}", path.display()))?;
let compression_level = self.config.compression_level;
if self.config.compress {
std::io::Write::write_all(
&mut zstd::stream::write::Encoder::new(cache_file, compression_level)?
.auto_finish(),
&serialized,
)
} else {
cache_file.write_all(&serialized)
}
.context("failed to write cache to file")?;
info!("wrote cache to {}", path.display());
Ok(())
}
}
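
`try_drop` is the persistence half of the pair: serialize the whole cache with `bitcode`, optionally pipe it through a zstd encoder, and `load_cache` later reverses both steps. a minimal round-trip sketch, assuming `bitcode` with its serde feature and zstd's `encode_all`/`decode_all` convenience helpers (the real code streams through `Encoder`/`Decoder` instead):

```rust
use serde::{Deserialize, Serialize};

#[derive(Serialize, Deserialize, PartialEq, Debug)]
struct Entry {
    mtime: u64,
    rendered: String,
}

fn main() {
    let entry = Entry { mtime: 42, rendered: "<p>hi</p>".into() };

    // serialize, then compress (level 3 is zstd's default)
    let serialized = bitcode::serialize(&entry).expect("serialize");
    let compressed = zstd::encode_all(serialized.as_slice(), 3).expect("compress");

    // decompress, then deserialize
    let decompressed = zstd::decode_all(compressed.as_slice()).expect("decompress");
    let roundtrip: Entry = bitcode::deserialize(&decompressed).expect("deserialize");
    assert_eq!(entry, roundtrip);
}
```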
impl Deref for CacheGuard {
type Target = Cache;
fn deref(&self) -> &Self::Target {
&self.inner
}
}
impl AsRef<Cache> for CacheGuard {
fn as_ref(&self) -> &Cache {
&self.inner
}
}
impl Drop for CacheGuard {
fn drop(&mut self) {
self.try_drop().expect("cache to save successfully")
}
}
pub(crate) async fn load_cache(config: &CacheConfig) -> Result<Cache, eyre::Report> {
let path = &config.file;
pub(crate) async fn load_cache(config: &Config) -> Result<Cache, eyre::Report> {
let path = &config.cache.file;
let mut cache_file = tokio::fs::File::open(&path)
.await
.context("failed to open cache file")?;
let serialized = if config.compress {
let serialized = if config.cache.compress {
let cache_file = cache_file.into_std().await;
tokio::task::spawn_blocking(move || {
let mut buf = Vec::with_capacity(4096);
@ -279,10 +159,5 @@ pub(crate) async fn load_cache(config: &CacheConfig) -> Result<Cache, eyre::Repo
buf
};
let mut cache: Cache =
bitcode::deserialize(serialized.as_slice()).context("failed to parse cache")?;
cache.ttl = config.ttl;
Ok(cache)
bitcode::deserialize(serialized.as_slice()).context("failed to parse cache")
}


@ -1,53 +1,42 @@
use std::collections::BTreeSet;
use std::hash::{DefaultHasher, Hash, Hasher};
use std::io;
use std::io::{self, Write};
use std::ops::Deref;
use std::path::Path;
use std::sync::Arc;
use std::time::Duration;
use std::time::Instant;
use std::time::SystemTime;
use axum::async_trait;
use axum::http::HeaderValue;
use chrono::{DateTime, Utc};
use color_eyre::eyre::{self, Context};
use comrak::plugins::syntect::SyntectAdapter;
use fronma::parser::{parse, ParsedData};
use indexmap::IndexMap;
use serde::Deserialize;
use serde_value::Value;
use tokio::fs;
use tokio::io::AsyncReadExt;
use tracing::{info, instrument, warn};
use tracing::{error, info, warn};
use crate::config::Config;
use crate::markdown_render::{build_syntect, render};
use crate::markdown_render::render;
use crate::post::cache::{load_cache, Cache, CACHE_VERSION};
use crate::post::{PostError, PostManager, PostMetadata, RenderStats, ReturnedPost};
use crate::systemtime_as_secs::as_secs;
use super::cache::{CacheGuard, CacheKey, CacheValue};
use super::{
ApplyFilters, Filter, PostError, PostManager, PostMetadata, RenderStats, ReturnedPost,
};
#[derive(Deserialize)]
struct FrontMatter {
pub title: Arc<str>,
pub description: Arc<str>,
pub author: Arc<str>,
pub icon: Option<Arc<str>>,
pub icon_alt: Option<Arc<str>>,
pub color: Option<Arc<str>>,
#[serde(alias = "created_at")]
pub written_at: Option<DateTime<Utc>>,
pub title: String,
pub description: String,
pub author: String,
pub icon: Option<String>,
pub created_at: Option<DateTime<Utc>>,
pub modified_at: Option<DateTime<Utc>>,
#[serde(default)]
pub tags: BTreeSet<Arc<str>>,
pub tags: BTreeSet<String>,
}
impl FrontMatter {
pub fn into_full(
self,
name: Arc<str>,
name: String,
created: Option<SystemTime>,
modified: Option<SystemTime>,
) -> PostMetadata {
@ -57,47 +46,62 @@ impl FrontMatter {
description: self.description,
author: self.author,
icon: self.icon,
icon_alt: self.icon_alt,
color: self.color,
written_at: self.written_at.or_else(|| created.map(|t| t.into())),
created_at: self.created_at.or_else(|| created.map(|t| t.into())),
modified_at: self.modified_at.or_else(|| modified.map(|t| t.into())),
tags: self.tags.into_iter().collect(),
}
}
}
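
posts start life as front matter plus a markdown body, split apart by `fronma` before `into_full` fills in filesystem timestamps. a cut-down sketch of that parse step, assuming fronma's default `---`-fenced YAML headers and a reduced header struct:

```rust
use fronma::parser::parse;
use serde::Deserialize;

#[derive(Deserialize, Debug)]
struct Headers {
    title: String,
    author: String,
}

fn main() {
    let source = "---\ntitle: hello world\nauthor: slonk\n---\n# hi\n";
    // parse() splits the document into typed headers and the raw body
    let parsed = parse::<Headers>(source).expect("invalid front matter");
    assert_eq!(parsed.headers.title, "hello world");
    assert_eq!(parsed.body.trim(), "# hi");
}
```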
pub struct MarkdownPosts {
cache: Option<Arc<CacheGuard>>,
config: Arc<Config>,
render_hash: u64,
syntect: SyntectAdapter,
pub struct MarkdownPosts<C>
where
C: Deref<Target = Config>,
{
cache: Option<Cache>,
config: C,
}
impl MarkdownPosts {
pub async fn new(
config: Arc<Config>,
cache: Option<Arc<CacheGuard>>,
) -> eyre::Result<MarkdownPosts> {
let syntect =
build_syntect(&config.render).context("failed to create syntax highlighting engine")?;
impl<C> MarkdownPosts<C>
where
C: Deref<Target = Config>,
{
pub async fn new(config: C) -> eyre::Result<MarkdownPosts<C>> {
if config.cache.enable {
if config.cache.persistence && tokio::fs::try_exists(&config.cache.file).await? {
info!("loading cache from file");
let mut cache = load_cache(&config).await.unwrap_or_else(|err| {
error!("failed to load cache: {}", err);
info!("using empty cache");
Default::default()
});
let mut hasher = DefaultHasher::new();
config.render.hash(&mut hasher);
let render_hash = hasher.finish();
if cache.version() < CACHE_VERSION {
warn!("cache version changed, clearing cache");
cache = Default::default();
};
Ok(Self {
cache,
config,
render_hash,
syntect,
})
Ok(Self {
cache: Some(cache),
config,
})
} else {
Ok(Self {
cache: Some(Default::default()),
config,
})
}
} else {
Ok(Self {
cache: None,
config,
})
}
}
async fn parse_and_render(
&self,
name: Arc<str>,
name: String,
path: impl AsRef<Path>,
) -> Result<(PostMetadata, Arc<str>, (Duration, Duration)), PostError> {
) -> Result<(PostMetadata, String, (Duration, Duration)), PostError> {
let parsing_start = Instant::now();
let mut file = match tokio::fs::OpenOptions::new().read(true).open(&path).await {
Ok(val) => val,
@ -118,64 +122,73 @@ impl MarkdownPosts {
let parsing = parsing_start.elapsed();
let before_render = Instant::now();
let post = render(body, &self.config.render, Some(&self.syntect)).into();
let post = render(body, &self.config.render);
let rendering = before_render.elapsed();
if let Some(cache) = &self.cache {
if let Some(cache) = self.cache.as_ref() {
cache
.insert(
name.clone(),
name.to_string(),
metadata.clone(),
as_secs(modified),
Arc::clone(&post),
self.render_hash,
as_secs(&modified),
post.clone(),
&self.config.render,
)
.await;
}
.await
.unwrap_or_else(|err| warn!("failed to insert {:?} into cache", err.0))
};
Ok((metadata, post, (parsing, rendering)))
}
}
#[async_trait]
impl PostManager for MarkdownPosts {
async fn get_all_posts(
&self,
filters: &[Filter<'_>],
query: &IndexMap<String, Value>,
) -> Result<Vec<(PostMetadata, Arc<str>, RenderStats)>, PostError> {
let mut posts = Vec::new();
let mut read_dir = fs::read_dir(&self.config.dirs.posts).await?;
while let Some(entry) = read_dir.next_entry().await? {
let path = entry.path();
let stat = fs::metadata(&path).await?;
if stat.is_file() && path.extension().is_some_and(|ext| ext == "md") {
let name = path
.clone()
.file_stem()
.unwrap()
.to_string_lossy()
.to_string()
.into();
let post = self.get_post(Arc::clone(&name), query).await?;
if let ReturnedPost::Rendered { meta, body, perf } = post
&& meta.apply_filters(filters)
{
posts.push((meta, body, perf));
}
}
}
Ok(posts)
fn cache(&self) -> Option<&Cache> {
self.cache.as_ref()
}
fn try_drop(&mut self) -> Result<(), eyre::Report> {
// write cache to file
let config = &self.config.cache;
if config.enable
&& config.persistence
&& let Some(cache) = self.cache()
{
let path = &config.file;
let serialized = bitcode::serialize(cache).context("failed to serialize cache")?;
let mut cache_file = std::fs::File::create(path)
.with_context(|| format!("failed to open cache at {}", path.display()))?;
let compression_level = config.compression_level;
if config.compress {
std::io::Write::write_all(
&mut zstd::stream::write::Encoder::new(cache_file, compression_level)?
.auto_finish(),
&serialized,
)
} else {
cache_file.write_all(&serialized)
}
.context("failed to write cache to file")?;
info!("wrote cache to {}", path.display());
}
Ok(())
}
}
impl<C> Drop for MarkdownPosts<C>
where
C: Deref<Target = Config>,
{
fn drop(&mut self) {
self.try_drop().unwrap()
}
}
impl<C> PostManager for MarkdownPosts<C>
where
C: Deref<Target = Config>,
{
async fn get_all_post_metadata(
&self,
filters: &[Filter<'_>],
_query: &IndexMap<String, Value>,
filter: impl Fn(&PostMetadata) -> bool,
) -> Result<Vec<PostMetadata>, PostError> {
let mut posts = Vec::new();
@ -185,21 +198,24 @@ impl PostManager for MarkdownPosts {
let stat = fs::metadata(&path).await?;
if stat.is_file() && path.extension().is_some_and(|ext| ext == "md") {
let mtime = as_secs(stat.modified()?);
let name: Arc<str> =
String::from(path.file_stem().unwrap().to_string_lossy()).into();
let mtime = as_secs(&stat.modified()?);
// TODO: this?
let name = path
.clone()
.file_stem()
.unwrap()
.to_string_lossy()
.to_string();
if let Some(cache) = &self.cache
&& let Some(hit) = cache
.lookup_metadata(name.clone(), mtime, self.render_hash)
.await
&& hit.apply_filters(filters)
if let Some(cache) = self.cache.as_ref()
&& let Some(hit) = cache.lookup_metadata(&name, mtime).await
&& filter(&hit)
{
posts.push(hit);
} else {
match self.parse_and_render(name, path).await {
Ok((metadata, ..)) => {
if metadata.apply_filters(filters) {
if filter(&metadata) {
posts.push(metadata);
}
}
@ -219,119 +235,110 @@ impl PostManager for MarkdownPosts {
Ok(posts)
}
#[instrument(level = "info", skip(self))]
async fn get_post(
async fn get_all_posts(
&self,
name: Arc<str>,
_query: &IndexMap<String, Value>,
) -> Result<ReturnedPost, PostError> {
let post = if self.config.markdown_access && self.is_raw(&name) {
let path = self.config.dirs.posts.join(&*name);
filter: impl Fn(&PostMetadata, &str) -> bool,
) -> Result<Vec<(PostMetadata, String, RenderStats)>, PostError> {
let mut posts = Vec::new();
let mut read_dir = fs::read_dir(&self.config.dirs.posts).await?;
while let Some(entry) = read_dir.next_entry().await? {
let path = entry.path();
let stat = fs::metadata(&path).await?;
if stat.is_file() && path.extension().is_some_and(|ext| ext == "md") {
let name = path
.clone()
.file_stem()
.unwrap()
.to_string_lossy()
.to_string();
let post = self.get_post(&name).await?;
if let ReturnedPost::Rendered(meta, content, stats) = post
&& filter(&meta, &content)
{
posts.push((meta, content, stats));
}
}
}
Ok(posts)
}
async fn get_post(&self, name: &str) -> Result<ReturnedPost, PostError> {
if self.config.markdown_access && name.ends_with(".md") {
let path = self.config.dirs.posts.join(name);
let mut file = match tokio::fs::OpenOptions::new().read(true).open(&path).await {
Ok(value) => value,
Err(err) => {
return match err.kind() {
io::ErrorKind::NotFound => Err(PostError::NotFound(name)),
_ => Err(PostError::IoError(err)),
Err(err) => match err.kind() {
io::ErrorKind::NotFound => {
if let Some(cache) = self.cache.as_ref() {
cache.remove(name).await;
}
return Err(PostError::NotFound(name.to_string()));
}
}
_ => return Err(PostError::IoError(err)),
},
};
let mut buffer = Vec::with_capacity(4096);
let mut buf = Vec::with_capacity(4096);
file.read_to_end(&mut buffer).await?;
file.read_to_end(&mut buf).await?;
ReturnedPost::Raw {
buffer,
content_type: HeaderValue::from_static("text/plain"),
}
Ok(ReturnedPost::Raw(
buf,
HeaderValue::from_static("text/plain"),
))
} else {
let start = Instant::now();
let path = self
.config
.dirs
.posts
.join(self.as_raw(&name).unwrap_or_else(|| unreachable!()));
let path = self.config.dirs.posts.join(name.to_owned() + ".md");
let stat = match tokio::fs::metadata(&path).await {
Ok(value) => value,
Err(err) => {
return match err.kind() {
io::ErrorKind::NotFound => Err(PostError::NotFound(name)),
_ => Err(PostError::IoError(err)),
Err(err) => match err.kind() {
io::ErrorKind::NotFound => {
if let Some(cache) = self.cache.as_ref() {
cache.remove(name).await;
}
return Err(PostError::NotFound(name.to_string()));
}
}
_ => return Err(PostError::IoError(err)),
},
};
let mtime = as_secs(stat.modified()?);
let mtime = as_secs(&stat.modified()?);
if let Some(cache) = &self.cache
&& let Some(CacheValue { meta, body, .. }) =
cache.lookup(name.clone(), mtime, self.render_hash).await
if let Some(cache) = self.cache.as_ref()
&& let Some(hit) = cache.lookup(name, mtime, &self.config.render).await
{
ReturnedPost::Rendered {
meta,
body,
perf: RenderStats::Cached(start.elapsed()),
}
Ok(ReturnedPost::Rendered(
hit.metadata,
hit.rendered,
RenderStats::Cached(start.elapsed()),
))
} else {
let (meta, body, stats) = self.parse_and_render(name, path).await?;
ReturnedPost::Rendered {
meta,
body,
perf: RenderStats::Rendered {
total: start.elapsed(),
parsed: stats.0,
rendered: stats.1,
},
}
let (metadata, rendered, stats) =
self.parse_and_render(name.to_string(), path).await?;
Ok(ReturnedPost::Rendered(
metadata,
rendered,
RenderStats::ParsedAndRendered(start.elapsed(), stats.0, stats.1),
))
}
};
if let ReturnedPost::Rendered { perf, .. } = &post {
info!("rendered post in {:?}", perf);
}
Ok(post)
}
async fn cleanup(&self) {
if let Some(cache) = &self.cache {
if let Some(cache) = self.cache.as_ref() {
cache
.cleanup(|CacheKey { name, extra }, value| {
// nuke entries with different render options
if self.render_hash != *extra {
return false;
}
let mtime = std::fs::metadata(
self.config
.dirs
.posts
.join(self.as_raw(name).unwrap_or_else(|| unreachable!())),
)
.ok()
.and_then(|metadata| metadata.modified().ok())
.map(as_secs);
match mtime {
Some(mtime) => mtime <= value.mtime,
None => false,
}
.cleanup(|name| {
std::fs::metadata(self.config.dirs.posts.join(name.to_owned() + ".md"))
.ok()
.and_then(|metadata| metadata.modified().ok())
.map(|mtime| as_secs(&mtime))
})
.await
}
}
fn is_raw(&self, name: &str) -> bool {
name.ends_with(".md")
}
fn as_raw(&self, name: &str) -> Option<String> {
let mut buf = String::with_capacity(name.len() + 3);
buf += name;
buf += ".md";
Some(buf)
}
}


@ -1,127 +1,65 @@
pub mod blag;
pub mod cache;
pub mod markdown_posts;
use std::sync::Arc;
use std::time::Duration;
use axum::async_trait;
use axum::http::HeaderValue;
use chrono::{DateTime, Utc};
use indexmap::IndexMap;
use serde::{Deserialize, Serialize};
use serde_value::Value;
use crate::error::PostError;
pub use blag::Blag;
pub use markdown_posts::MarkdownPosts;
pub use crate::post::markdown_posts::MarkdownPosts;
// TODO: replace String with Arc<str>
#[derive(Serialize, Deserialize, Clone, Debug)]
pub struct PostMetadata {
pub name: Arc<str>,
pub title: Arc<str>,
pub description: Arc<str>,
pub author: Arc<str>,
pub icon: Option<Arc<str>>,
pub icon_alt: Option<Arc<str>>,
pub color: Option<Arc<str>>,
pub written_at: Option<DateTime<Utc>>,
pub name: String,
pub title: String,
pub description: String,
pub author: String,
pub icon: Option<String>,
pub created_at: Option<DateTime<Utc>>,
pub modified_at: Option<DateTime<Utc>>,
pub tags: Vec<Arc<str>>,
pub tags: Vec<String>,
}
#[derive(Serialize, Debug, Clone)]
#[allow(unused)]
pub enum RenderStats {
Cached(Duration),
Rendered {
total: Duration,
parsed: Duration,
rendered: Duration,
},
Fetched(Duration),
Other {
verb: Arc<str>,
time: Duration,
},
Unknown,
// format: Total, Parsed in, Rendered in
ParsedAndRendered(Duration, Duration, Duration),
}
#[allow(clippy::large_enum_variant)] // Raw will be returned very rarely
#[derive(Debug, Clone)]
pub enum ReturnedPost {
Rendered {
meta: PostMetadata,
body: Arc<str>,
perf: RenderStats,
},
Raw {
buffer: Vec<u8>,
content_type: HeaderValue,
},
Rendered(PostMetadata, String, RenderStats),
Raw(Vec<u8>, HeaderValue),
}
pub enum Filter<'a> {
Tags(&'a [&'a str]),
}
impl Filter<'_> {
pub fn apply(&self, meta: &PostMetadata) -> bool {
match self {
Filter::Tags(tags) => tags
.iter()
.any(|tag| meta.tags.iter().any(|meta_tag| &**meta_tag == *tag)),
}
}
}
pub trait ApplyFilters {
fn apply_filters(&self, filters: &[Filter<'_>]) -> bool;
}
impl ApplyFilters for PostMetadata {
fn apply_filters(&self, filters: &[Filter<'_>]) -> bool {
for filter in filters {
if !filter.apply(self) {
return false;
}
}
true
}
}
#[async_trait]
pub trait PostManager {
async fn get_all_post_metadata(
&self,
filters: &[Filter<'_>],
query: &IndexMap<String, Value>,
filter: impl Fn(&PostMetadata) -> bool,
) -> Result<Vec<PostMetadata>, PostError> {
self.get_all_posts(filters, query)
self.get_all_posts(|m, _| filter(m))
.await
.map(|vec| vec.into_iter().map(|(meta, ..)| meta).collect())
}
async fn get_all_posts(
&self,
filters: &[Filter<'_>],
query: &IndexMap<String, Value>,
) -> Result<Vec<(PostMetadata, Arc<str>, RenderStats)>, PostError>;
filter: impl Fn(&PostMetadata, &str) -> bool,
) -> Result<Vec<(PostMetadata, String, RenderStats)>, PostError>;
async fn get_max_n_post_metadata_with_optional_tag_sorted(
&self,
n: Option<usize>,
tag: Option<&str>,
query: &IndexMap<String, Value>,
tag: Option<&String>,
) -> Result<Vec<PostMetadata>, PostError> {
let filters = tag.and(Some(Filter::Tags(tag.as_slice())));
let mut posts = self
.get_all_post_metadata(filters.as_slice(), query)
.get_all_post_metadata(|metadata| !tag.is_some_and(|tag| !metadata.tags.contains(tag)))
.await?;
// we still want some semblance of order if created_at is None, so sort by mtime as well
posts.sort_unstable_by_key(|metadata| metadata.modified_at.unwrap_or_default());
posts.sort_by_key(|metadata| metadata.written_at.unwrap_or_default());
posts.sort_by_key(|metadata| metadata.created_at.unwrap_or_default());
posts.reverse();
if let Some(n) = n {
posts.truncate(n);
@ -131,32 +69,14 @@ pub trait PostManager {
}
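
the double sort is deliberate: the first (unstable) pass orders by mtime, and because the second pass is a stable sort, posts whose creation date is `None` (and therefore tie at `Default::default()`) keep that mtime order after the reverse. a small demonstration of the pattern:

```rust
fn main() {
    // (written_at, mtime) pairs; a None date falls back to the default (0)
    let mut posts = vec![(None, 30_u64), (Some(5_u64), 10), (None, 20)];

    // pass 1: unstable sort by the fallback key
    posts.sort_unstable_by_key(|p| p.1);
    // pass 2: stable sort by the primary key preserves pass 1's order on ties
    posts.sort_by_key(|p| p.0.unwrap_or_default());
    posts.reverse();

    // dated post first, then undated posts newest-mtime first
    assert_eq!(posts, vec![(Some(5), 10), (None, 30), (None, 20)]);
}
```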
#[allow(unused)]
async fn get_post_metadata(
&self,
name: Arc<str>,
query: &IndexMap<String, Value>,
) -> Result<PostMetadata, PostError> {
match self.get_post(name.clone(), query).await? {
ReturnedPost::Rendered { meta, .. } => Ok(meta),
ReturnedPost::Raw { .. } => Err(PostError::NotFound(name)),
async fn get_post_metadata(&self, name: &str) -> Result<PostMetadata, PostError> {
match self.get_post(name).await? {
ReturnedPost::Rendered(metadata, ..) => Ok(metadata),
ReturnedPost::Raw(..) => Err(PostError::NotFound(name.to_string())),
}
}
async fn get_post(
&self,
name: Arc<str>,
query: &IndexMap<String, Value>,
) -> Result<ReturnedPost, PostError>;
async fn get_post(&self, name: &str) -> Result<ReturnedPost, PostError>;
async fn cleanup(&self) {}
#[allow(unused)]
fn is_raw(&self, name: &str) -> bool {
false
}
#[allow(unused)]
fn as_raw(&self, name: &str) -> Option<String> {
None
}
async fn cleanup(&self);
}


@ -0,0 +1,37 @@
pub struct RangedI128Visitor<const START: i128, const END: i128>;
impl<'de, const START: i128, const END: i128> serde::de::Visitor<'de>
for RangedI128Visitor<START, END>
{
type Value = i128;
fn expecting(&self, formatter: &mut std::fmt::Formatter) -> std::fmt::Result {
write!(formatter, "an integer between {START} and {END}")
}
fn visit_i32<E>(self, v: i32) -> std::result::Result<Self::Value, E>
where
E: serde::de::Error,
{
self.visit_i128(v as i128)
}
fn visit_i64<E>(self, v: i64) -> std::prelude::v1::Result<Self::Value, E>
where
E: serde::de::Error,
{
self.visit_i128(v as i128)
}
fn visit_i128<E>(self, v: i128) -> std::prelude::v1::Result<Self::Value, E>
where
E: serde::de::Error,
{
if v >= START && v <= END {
Ok(v)
} else {
Err(E::custom(format!(
"integer is out of bounds ({START}..{END})"
)))
}
}
}
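
to actually use the visitor, a `Deserialize` impl hands it to the deserializer. a hypothetical newtype wired up to it (`Percent` is illustrative and assumes the visitor definition above):

```rust
// hypothetical newtype that only accepts 0..=100
struct Percent(i128);

impl<'de> serde::Deserialize<'de> for Percent {
    fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
    where
        D: serde::Deserializer<'de>,
    {
        // out-of-range integers become a deserialization error
        deserializer
            .deserialize_i64(RangedI128Visitor::<0, 100>)
            .map(Percent)
    }
}
```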


@ -1,78 +0,0 @@
use std::convert::Infallible;
use std::str::pattern::Pattern;
use axum::extract::Request;
use axum::http::{header, StatusCode};
use axum::response::{IntoResponse, Response};
use include_dir::{Dir, DirEntry};
use tracing::{debug, trace};
fn if_empty<'a>(a: &'a str, b: &'a str) -> &'a str {
if a.is_empty() {
b
} else {
a
}
}
fn remove_prefixes(mut src: &str, pat: (impl Pattern + Copy)) -> &str {
while let Some(removed) = src.strip_prefix(pat) {
src = removed;
}
src
}
fn from_included_file(file: &'static include_dir::File<'static>) -> Response {
let mime_type = mime_guess::from_path(file.path()).first_or_octet_stream();
(
[(
header::CONTENT_TYPE,
header::HeaderValue::try_from(mime_type.essence_str()).expect("invalid mime type"),
)],
file.contents(),
)
.into_response()
}
pub async fn handle(
req: Request,
included_dir: &'static Dir<'static>,
) -> Result<Response, Infallible> {
let path = req.uri().path();
let has_dotdot = path.split('/').any(|seg| seg == "..");
if has_dotdot {
return Ok(StatusCode::NOT_FOUND.into_response());
}
let relative_path = if_empty(remove_prefixes(path, '/'), ".");
match included_dir.get_entry(relative_path) {
Some(DirEntry::Dir(dir)) => {
trace!("{relative_path:?} is a directory, trying \"index.html\"");
if let Some(file) = dir.get_file("index.html") {
debug!("{path:?} (index.html) serving from included dir");
return Ok(from_included_file(file));
} else {
trace!("\"index.html\" not found in {relative_path:?} in included files");
}
}
None if relative_path == "." => {
trace!("requested root, trying \"index.html\"");
if let Some(file) = included_dir.get_file("index.html") {
debug!("{path:?} (index.html) serving from included dir");
return Ok(from_included_file(file));
} else {
trace!("\"index.html\" not found in included files");
}
}
Some(DirEntry::File(file)) => {
debug!("{path:?} serving from included dir");
return Ok(from_included_file(file));
}
None => trace!("{relative_path:?} not found in included files"),
};
Ok(StatusCode::NOT_FOUND.into_response())
}
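
before touching the embedded directory, the handler rejects any path containing a `..` segment; since `include_dir` lookups are plain string keys, this check is the traversal defense. the check in isolation:

```rust
fn has_dotdot(path: &str) -> bool {
    path.split('/').any(|seg| seg == "..")
}

fn main() {
    assert!(has_dotdot("/static/../config.toml"));
    assert!(!has_dotdot("/static/style.css"));
    // ".." only counts as a whole segment, not a substring
    assert!(!has_dotdot("/static/..hidden/file"));
}
```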


@ -1,13 +1,9 @@
use std::time::SystemTime;
pub fn as_secs(t: SystemTime) -> u64 {
t.duration_since(SystemTime::UNIX_EPOCH)
.unwrap_or_else(|err| err.duration())
.as_secs()
}
pub fn as_millis(t: SystemTime) -> u128 {
t.duration_since(SystemTime::UNIX_EPOCH)
.unwrap_or_else(|err| err.duration())
.as_millis()
pub fn as_secs(t: &SystemTime) -> u64 {
match t.duration_since(SystemTime::UNIX_EPOCH) {
Ok(duration) => duration,
Err(err) => err.duration(),
}
.as_secs()
}
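
both versions of `as_secs` handle clocks set before the unix epoch: `duration_since` fails in that case, and the error's `duration()` is the magnitude of the difference, so the helper never panics. for example:

```rust
use std::time::{Duration, SystemTime};

fn as_secs(t: SystemTime) -> u64 {
    t.duration_since(SystemTime::UNIX_EPOCH)
        .unwrap_or_else(|err| err.duration())
        .as_secs()
}

fn main() {
    // pre-epoch times come back as their absolute distance, not a panic
    let five_before_epoch = SystemTime::UNIX_EPOCH - Duration::from_secs(5);
    assert_eq!(as_secs(five_before_epoch), 5);
}
```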


@ -1,153 +0,0 @@
pub mod watcher;
use std::{io, path::Path};
use handlebars::{Handlebars, Template};
use include_dir::{include_dir, Dir};
use thiserror::Error;
use tracing::{debug, error, info_span, trace};
const TEMPLATES: Dir<'static> = include_dir!("$CARGO_MANIFEST_DIR/templates");
#[derive(Error, Debug)]
#[allow(clippy::enum_variant_names)]
pub enum TemplateError {
#[error(transparent)]
IoError(#[from] std::io::Error),
#[error("file doesn't contain valid UTF-8")]
UTF8Error,
#[error(transparent)]
TemplateError(#[from] handlebars::TemplateError),
}
fn is_ext(path: impl AsRef<Path>, ext: &str) -> bool {
match path.as_ref().extension() {
Some(path_ext) if path_ext != ext => false,
None => false,
_ => true,
}
}
fn get_template_name(path: &Path) -> Option<&str> {
if !is_ext(path, "hbs") {
return None;
}
path.file_stem()?.to_str()
}
fn register_included_file(
file: &include_dir::File<'_>,
name: &str,
registry: &mut Handlebars,
) -> Result<(), TemplateError> {
let template = compile_included_file(file)?;
registry.register_template(name, template);
Ok(())
}
fn register_path(
path: impl AsRef<std::path::Path>,
name: &str,
registry: &mut Handlebars<'_>,
) -> Result<(), TemplateError> {
let template = compile_path(path)?;
registry.register_template(name, template);
Ok(())
}
fn compile_included_file(file: &include_dir::File<'_>) -> Result<Template, TemplateError> {
let contents = file.contents_utf8().ok_or(TemplateError::UTF8Error)?;
let template = Template::compile(contents)?;
Ok(template)
}
fn compile_path(path: impl AsRef<std::path::Path>) -> Result<Template, TemplateError> {
use std::fs::OpenOptions;
use std::io::Read;
let mut file = OpenOptions::new().read(true).open(path)?;
let mut buf = String::new();
file.read_to_string(&mut buf)?;
let template = Template::compile(&buf)?;
Ok(template)
}
async fn compile_path_async_io(
path: impl AsRef<std::path::Path>,
) -> Result<Template, TemplateError> {
use tokio::fs::OpenOptions;
use tokio::io::AsyncReadExt;
let mut file = OpenOptions::new().read(true).open(path).await?;
let mut buf = String::new();
file.read_to_string(&mut buf).await?;
let template = Template::compile(&buf)?;
Ok(template)
}
pub fn new_registry<'a>(custom_templates_path: impl AsRef<Path>) -> io::Result<Handlebars<'a>> {
let mut reg = Handlebars::new();
for entry in TEMPLATES.entries() {
let file = match entry.as_file() {
Some(file) => file,
None => continue,
};
let span = info_span!("register_included_template", path = ?file.path());
let _handle = span.enter();
let name = match get_template_name(file.path()) {
Some(v) => v,
None => {
trace!("skipping file");
continue;
}
};
match register_included_file(file, name, &mut reg) {
Ok(()) => debug!("registered template {name:?}"),
Err(err) => error!("error while registering template: {err}"),
};
}
let read_dir = match std::fs::read_dir(custom_templates_path) {
Ok(v) => v,
Err(err) => match err.kind() {
io::ErrorKind::NotFound => return Ok(reg),
_ => panic!("{:?}", err),
},
};
for entry in read_dir {
let entry = entry.unwrap();
let file_type = entry.file_type()?;
if !file_type.is_file() {
continue;
}
let path = entry.path();
let span = info_span!("register_custom_template", ?path);
let _handle = span.enter();
let name = match get_template_name(&path) {
Some(v) => v,
None => {
trace!("skipping file");
continue;
}
};
match register_path(&path, name, &mut reg) {
Ok(()) => debug!("registered template {name:?}"),
Err(err) => error!("error while registering template: {err}"),
};
}
Ok(reg)
}
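
`new_registry` layers user templates over the compiled-in ones by registering both under the same name, so a custom file simply shadows the default. a reduced sketch of that register-then-shadow cycle, using handlebars' string-registration helper instead of the file plumbing above:

```rust
use handlebars::Handlebars;
use serde_json::json;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let mut reg = Handlebars::new();

    // compiled-in default...
    reg.register_template_string("title", "<h1>{{title}}</h1>")?;
    // ...shadowed by a "custom" template registered under the same name
    reg.register_template_string("title", "<h1 class=\"fancy\">{{title}}</h1>")?;

    let html = reg.render("title", &json!({ "title": "bingus-blog" }))?;
    assert!(html.contains("fancy"));
    Ok(())
}
```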


@ -1,126 +0,0 @@
use std::path::Path;
use std::sync::Arc;
use std::time::Duration;
use handlebars::{Handlebars, Template};
use notify_debouncer_full::notify::{self};
use notify_debouncer_full::{new_debouncer, DebouncedEvent};
use tokio::select;
use tokio::sync::RwLock;
use tokio_util::sync::CancellationToken;
use tracing::{debug, debug_span, error, info, trace};
use crate::templates::*;
async fn process_event(
event: DebouncedEvent,
templates: &mut Vec<(String, Template)>,
) -> Result<(), Box<dyn std::error::Error>> {
match event.kind {
notify::EventKind::Create(notify::event::CreateKind::File)
| notify::EventKind::Modify(_) => {
for path in &event.paths {
let span = debug_span!("modify_event", ?path);
let _handle = span.enter();
let template_name = match get_template_name(path) {
Some(v) => v,
None => {
trace!("skipping event");
continue;
}
};
trace!("processing recompilation");
let compiled = compile_path_async_io(path).await?;
debug!("compiled template {template_name:?}");
templates.push((template_name.to_owned(), compiled));
}
}
notify::EventKind::Remove(notify::event::RemoveKind::File) => {
for path in &event.paths {
let span = debug_span!("remove_event", ?path);
let _handle = span.enter();
let (file_name, template_name) = match path
.file_name()
.and_then(|o| o.to_str())
.and_then(|file_name| {
get_template_name(Path::new(file_name))
.map(|template_name| (file_name, template_name))
}) {
Some(v) => v,
None => {
trace!("skipping event");
continue;
}
};
trace!("processing removal");
let file = TEMPLATES.get_file(file_name);
if let Some(file) = file {
let compiled = compile_included_file(file)?;
debug!("compiled template {template_name:?}");
templates.push((template_name.to_owned(), compiled));
}
}
}
_ => {}
};
Ok(())
}
pub async fn watch_templates<'a>(
path: impl AsRef<Path>,
watcher_token: CancellationToken,
reg: Arc<RwLock<Handlebars<'a>>>,
) -> Result<(), Box<dyn std::error::Error + Send + Sync + 'static>> {
let path = path.as_ref();
let (tx, mut rx) = tokio::sync::mpsc::channel(1);
let mut debouncer = new_debouncer(Duration::from_millis(100), None, move |events| {
tx.blocking_send(events)
.expect("failed to send message over channel")
})?;
debouncer.watch(path, notify::RecursiveMode::NonRecursive)?;
'event_loop: while let Some(events) = select! {
_ = watcher_token.cancelled() => {
debug!("exiting watcher loop");
break 'event_loop;
},
events = rx.recv() => events
} {
let events = match events {
Ok(events) => events,
Err(err) => {
error!("error getting events: {err:?}");
continue;
}
};
let mut templates = Vec::new();
for event in events {
if let Err(err) = process_event(event, &mut templates).await {
error!("error while processing event: {err}");
}
}
if !templates.is_empty() {
let mut reg = reg.write().await;
for template in templates.into_iter() {
debug!("registered template {}", template.0);
reg.register_template(&template.0, template.1);
}
drop(reg);
info!("updated custom templates");
}
}
Ok(())
}
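
the watcher loop exits through the `CancellationToken` arm of `select!`, which lets the server stop the task cleanly instead of dropping it mid-event. the shutdown pattern in isolation, assuming a tokio runtime:

```rust
use std::time::Duration;
use tokio_util::sync::CancellationToken;

#[tokio::main]
async fn main() {
    let token = CancellationToken::new();
    let child = token.clone();

    let worker = tokio::spawn(async move {
        let mut ticks = 0u32;
        loop {
            tokio::select! {
                // cancellation wins the race and breaks the loop
                _ = child.cancelled() => break,
                _ = tokio::time::sleep(Duration::from_millis(10)) => ticks += 1,
            }
        }
        ticks
    });

    tokio::time::sleep(Duration::from_millis(50)).await;
    token.cancel();
    let ticks = worker.await.unwrap();
    println!("worker observed {ticks} ticks before shutdown");
}
```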


@ -1,8 +0,0 @@
function replaceDates() {
for (let el of document.querySelectorAll(".date-rfc3339")) {
let date = new Date(Date.parse(el.textContent));
el.title = el.textContent;
el.textContent = date.toLocaleString();
el.classList.replace("date-rfc3339", "tooltipped");
}
}


@ -1,12 +1,4 @@
replaceDates();
let form = document.getElementById("sort");
if (form) {
form.style.display = "block";
let postsByDate = document.getElementById("posts");
let postsByName = document.createElement("div");
populateByName(postsByDate, postsByName);
postsByDate.parentNode.appendChild(postsByName);
handleSort(form, postsByDate, postsByName);
sort(form.sort.value, postsByDate, postsByName);
for (let el of document.querySelectorAll(".date-rfc3339")) {
let date = new Date(Date.parse(el.textContent));
el.textContent = date.toLocaleString();
}


@ -54,22 +54,3 @@ th,
td:nth-child(1) {
word-break: keep-all;
}
blockquote {
margin-left: 1em;
padding-left: 1.5em;
border-left: 0.5em solid;
border-color: var(--blue);
& > blockquote {
border-color: var(--mauve);
& > blockquote {
border-color: var(--pink);
& > blockquote {
border-color: var(--rosewater);
& > blockquote {
border-color: var(--text);
}
}
}
}
}


@ -1,32 +0,0 @@
function populateByName(source, target) {
let posts = [];
for (let post of source.children) {
let title = post.firstElementChild.innerText;
posts.push([title, post.cloneNode(true)]);
}
posts.sort(([a, _1], [b, _2]) => a.toLocaleLowerCase().localeCompare(b.toLocaleLowerCase()));
for (let [_, post] of posts) {
target.appendChild(post);
}
}
function sort(by, dateEl, nameEl) {
console.log("sorting by", by);
switch (by) {
case "date":
dateEl.style.display = "block";
nameEl.style.display = "none";
break;
case "name":
nameEl.style.display = "block";
dateEl.style.display = "none";
break;
}
}
function handleSort(form, dateEl, nameEl) {
for (let el of form.sort)
el.addEventListener("change", () => {
if (el.checked) sort(el.value, dateEl, nameEl);
});
}


@ -1,5 +1,4 @@
/* colors from catppuccin https://github.com/catppuccin/catppuccin
licensed under the MIT license, available in the source tree */
/* colors */
:root {
--base: #1e1e2e;
--text: #cdd6f4;
@ -28,21 +27,15 @@
}
}
:root {
/* please have at least one good monospace font */
font-family: "Hack", "Hack Nerd Font", "JetBrains Mono", "JetBrainsMono Nerd Font", "Source Code Pro", "Ubuntu Mono", monospace,
"Noto Sans", sans-serif;
}
:root,
code {
font-family: "Hack", "Hack Nerd Font", "JetBrains Mono", "JetBrainsMono Nerd Font", "Source Code Pro", monospace;
/* please have at least one good monospace font */
font-family: "Hack", "Hack Nerd Font", "JetBrains Mono", "JetBrainsMono Nerd Font", "Ubuntu Mono", monospace, sans-serif;
}
:root {
background-color: var(--base);
color: var(--text);
overflow-wrap: break-word;
}
a {
@ -104,6 +97,7 @@ div.post {
.table > .value {
margin-left: 1em;
text-align: end;
grid-column: 2;
}
@ -119,11 +113,6 @@ div.post {
grid-row: 3;
}
#sort {
display: inline-block;
margin-bottom: 1rem;
}
/* BEGIN cool effect everyone liked */
body {


@ -1,25 +0,0 @@
running
<a href="{{bingus_info.repository}}" target="_blank">{{bingus_info.name}}</a>
v{{bingus_info.version}}
{{#if rendered_in}}
<b> - </b>
{{/if}}
{{#each rendered_in}}
{{#if (eq @key "Rendered")}}
<span class="tooltipped" title="parsing metadata took {{duration this.parsed}}">parsed meta</span>
and
<span class="tooltipped" title="rendering took {{duration this.rendered}}">rendered</span>
in
{{duration this.total}}
{{else if (eq @key "Cached")}}
retrieved from cache in
{{duration this}}
{{else if (eq @key "Fetched")}}
fetched in
{{duration this}}
{{/if}}
{{/each}}
{{#if raw_name}}
-
<a href="/posts/{{raw_name}}">view raw</a>
{{/if}}


@ -1,65 +0,0 @@
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<meta name="description" content="{{title}}" />
<meta property="og:title" content="{{title}}" />
<meta property="og:description" content="{{description}}" />
<meta name="keywords" content="{{joined_tags}}" />
{{#if (ne color null)}}
<meta name="theme-color" content="{{style.color}}" />
{{/if}}
<title>{{title}}</title>
<link rel="stylesheet" href="/static/style.css" />
{{#if rss}}
<link rel="alternate" type="application/rss+xml" title="{{title}}" href="/feed.xml" />
{{/if}}
{{#if js}}
<script src="/static/date.js" defer></script>
<script src="/static/sort.js" defer></script>
<script src="/static/main.js" defer></script>
{{/if}}
</head>
<body>
<main>
{{>title}}
<h2>posts</h2>
<div>
{{#if js}}
<form id="sort" style="display: none">
sort by: {{sort}}
<br />
<input type="radio" name="sort" id="sort-date" value="date" {{#if (eq style.default_sort "date")}}checked{{/if}} />
<label for="sort-date">date</label>
<input type="radio" name="sort" id="sort-name" value="name" {{#if (eq style.default_sort "name")}}checked{{/if}} />
<label for="sort-name">name</label>
</form>
{{/if}}
<div id="posts">
{{#each posts}}
<div class="post">
<a href="/posts/{{name}}"><b>{{title}}</b></a>
<span class="post-author">- by {{author}}</span>
<br />
{{description}}<br />
{{>post_table post style=@root.style}}
</div>
{{else}} there are no posts right now. check back later! {{/each}}
</div>
</div>
{{#if (gt (len tags) 0)}}
<h2>tags</h2>
<b><a href="/">clear tags</a></b>
<br />
{{/if}}
{{#each tags}}
<a href="/?tag={{@key}}" title="view all posts with this tag">{{@key}}</a>
<span class="post-author">- {{this}} post{{#if (ne this 1)}}s{{/if}}</span><br />
{{/each}}
</main>
<footer>
{{>footer}}
</footer>
</body>
</html>

46 templates/index.html Normal file
@ -0,0 +1,46 @@
{%- import "macros.askama" as macros -%}
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<meta name="description" content="{{ title }}" />
<meta property="og:title" content="{{ title }}" />
<meta property="og:description" content="{{ description }}" />
<title>{{ title }}</title>
<link rel="stylesheet" href="/static/style.css" />
{% if js %}
<script src="/static/main.js" defer></script>
{% endif %}
</head>
<body>
<main>
<h1>{{ title }}</h1>
<p>{{ description }}</p>
<h2>posts</h2>
<!-- prettier-ignore -->
<div>
{% if posts.is_empty() %}
there are no posts right now. check back later!
{% endif %}<!-- prettier-br -->
{% for post in posts %}
<div class="post">
<a href="/posts/{{ post.name }}"><b>{{ post.title }}</b></a>
<span class="post-author">- by {{ post.author }}</span>
<br />
{{ post.description }}<br />
{% call macros::table(post) %}
</div>
{% endfor %}
</div>
{% let tags = posts|collect_tags %}<!-- prettier-br -->
{% if !tags.is_empty() %}
<h2>tags</h2>
{% endif %}<!-- prettier-br -->
{% for tag in tags %}
<a href="/?tag={{ tag.0 }}" title="view all posts with this tag">{{ tag.0 }}</a>
<span class="post-author">- {{ tag.1 }} post{% if tag.1 != 1 %}s{%endif %}</span><br />
{% endfor %}
</main>
</body>
</html>

31 templates/macros.askama Normal file
@ -0,0 +1,31 @@
{% macro span_date(value) %}
<span class="{%- match df -%}
{% when DateFormat::RFC3339 %}
date-rfc3339
{% when DateFormat::Strftime(_) %}
{%- endmatch -%}">{{ value|date(df) }}</span>
{% endmacro %}
{% macro table(post) %}
<div class="table">
{% match post.created_at %}
{% when Some(created_at) %}
<div class="created">written</div>
<div class="created value">{% call span_date(created_at) %}</div>
{% when None %}
{% endmatch %}
{% match post.modified_at %}
{% when Some(modified_at) %}
<div class="modified">last modified</div>
<div class="modified value">{% call span_date(modified_at) %}</div>
{% when None %}
{% endmatch %}
{% if !post.tags.is_empty() %}
<div class="tags">tags</div>
<div class="tags value">
{% for tag in post.tags %}
<a href="/?tag={{ tag }}" title="view all posts with this tag">{{ tag }}</a>
{% endfor %}
</div>
{% endif %}
</div>
{% endmacro %}


@ -1,55 +0,0 @@
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<meta name="author" content="{{meta.author}}" />
<meta name="keywords" content="{{joined_tags}}" />
<meta name="description" content="{{meta.title}}" />
<!-- you know what I really love? platforms like discord
favoring twitter embeds over the open standard. to color
your embed or have large images, you have to do _this_. lmao -->
<meta property="og:title" content="{{meta.title}}" />
<meta property="twitter:title" content="{{meta.title}}" />
<meta property="og:description" content="{{meta.description}}" />
<meta property="twitter:description" content="{{meta.description}}" />
{{#if (ne meta.icon null)}}
<meta property="og:image" content="{{meta.icon}}" />
<meta name="twitter:card" content="summary_large_image" />
<meta property="twitter:image:src" content="{{meta.icon}}" />
{{#if (ne meta.icon_alt null)}}
<meta property="og:image:alt" content="{{meta.icon_alt}}" />
<meta property="twitter:image:alt" content="{{meta.icon_alt}}" />
{{/if}}{{/if}}
{{#if (ne color null)}}
<meta name="theme-color" content="{{color}}" />
{{/if}}
<title>{{meta.title}}</title>
<link rel="stylesheet" href="/static/style.css" />
<link rel="stylesheet" href="/static/post.css" />
<link rel="stylesheet" href="/static/custom/style.css" />
<link rel="stylesheet" href="/static/custom/post.css" />
{{#if js}}
<script src="/static/date.js" defer></script>
<script src="/static/main.js" defer></script>
{{/if}}
</head>
<body>
<main>
<h1 class="post-title">
{{meta.title}}
<span class="post-author">- by {{meta.author}}</span>
</h1>
<p class="post-desc">{{meta.description}}</p>
<div class="post">
{{>post_table meta style=@root.style}}
<a href="/posts/{{meta.name}}">link</a><br />
<a href="/">back to home</a>
</div>
<hr />
{{{rendered}}}
</main>
<footer>
{{>footer}}
</footer>
</body>
</html>

51 templates/post.html Normal file
@ -0,0 +1,51 @@
{%- import "macros.askama" as macros -%}
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<meta name="description" content="{{ meta.title }}" />
<meta property="og:title" content="{{ meta.title }}" />
<meta property="og:description" content="{{ meta.description }}" />
{% match meta.icon %} {% when Some with (url) %}
<meta property="og:image" content="{{ url }}" />
<link rel="shortcut icon" href="{{ url }}" />
{% when None %} {% endmatch %}
<title>{{ meta.title }}</title>
<link rel="stylesheet" href="/static/style.css" />
<link rel="stylesheet" href="/static/post.css" />
{% if js %}
<script src="/static/main.js" defer></script>
{% endif %}
</head>
<body>
<main>
<h1 class="post-title">
{{ meta.title }}
<span class="post-author">- by {{ meta.author }}</span>
</h1>
<p class="post-desc">{{ meta.description }}</p>
<div class="post">
<!-- prettier-ignore -->
{% call macros::table(meta) %}
<a href="/posts/{{ meta.name }}">link</a><br />
<a href="/">back to home</a>
</div>
<hr />
{{ rendered|escape("none") }}
</main>
<!-- prettier-ignore -->
<footer>
{% match rendered_in %}
{% when RenderStats::ParsedAndRendered(total, parsing, rendering) %}
<span class="tooltipped" title="parsing took {{ parsing|duration }}">parsed</span> and
<span class="tooltipped" title="rendering took {{ rendering|duration }}">rendered</span> in {{ total|duration }}
{% when RenderStats::Cached(total) %}
retrieved from cache in {{ total|duration }}
{% endmatch %}
{% if markdown_access %}
- <a href="/posts/{{ meta.name }}.md">view raw</a>
{% endif %}
</footer>
</body>
</html>

0 templates/post_list.html Normal file
@ -1,18 +0,0 @@
<div class="table">
{{#if (and (ne this.written_at null) style.display_dates.creation)}}
<div class="created">written</div>
<div class="created value">{{>span_date dt=this.written_at df=style.date_format}}</div>
{{/if}}
{{#if (and (ne this.modified_at null) style.display_dates.modification)}}
<div class="modified">last modified</div>
<div class="modified value">{{>span_date dt=this.modified_at df=style.date_format}}</div>
{{/if}}
{{#if (gt (len this.tags) 0)}}
<div class="tags">tags</div>
<div class="tags value">
{{#each this.tags}}
<a href="/?tag={{this}}" title="view all posts with this tag">{{this}}</a>
{{/each}}
</div>
{{/if}}
</div>


@ -1 +0,0 @@
<span class="date {{#if (eq df "RFC3339")}}date-rfc3339{{/if}}">{{date dt df}}</span>


@ -1,2 +0,0 @@
<h1>{{title}}</h1>
<p>{{description}}</p>