Compare commits


80 commits
main ... main

Author SHA1 Message Date
9fb372574d add todo for credits 2024-08-31 02:54:52 +03:00
35ea2679a7 add todo 2024-08-17 23:13:53 +03:00
cf89b8db7f add styles for blockquotes 2024-08-17 18:44:59 +03:00
f7977412bc fix links 2024-08-17 10:47:48 +03:00
74f7ba968a add toggles for dates ("grr its too complicated") 2024-08-14 14:00:52 +03:00
e50501c588 Merge branch 'custom_content' 2024-08-13 17:05:08 +03:00
3e32257e56 implement configuration for the custom templates directory and change the default 2024-08-13 17:02:33 +03:00
99e91db6aa update documentation 2024-08-13 16:53:39 +03:00
2eb14be977 implement custom static loading 2024-08-13 16:06:33 +03:00
9d91e829c8 switch to handlebars 2024-08-13 15:53:18 +03:00
0e97ffaeb8 format markdown 2024-08-03 10:28:41 +03:00
11ac810bce update info about feature requests 2024-08-03 10:28:41 +03:00
7fc60fdc5e format markdown 2024-08-03 10:28:01 +03:00
e3bfa2f53f load custom css from /static/custom 2024-08-03 10:26:38 +03:00
af07b57dc6 update info about feature requests 2024-08-03 01:13:02 +03:00
2a4bef84b1 dont show sort form if there is no javascript 2024-08-02 13:21:09 +03:00
5dcaf85984 fix race condition properly and ship less js 2024-08-01 13:46:03 +03:00
96922f2483 maybe fix race condition 2024-08-01 01:10:08 +03:00
47476ceb3d add todos 2024-08-01 01:03:51 +03:00
272b2b8d27 fix sorting 2024-08-01 00:52:29 +03:00
757e7fb21a add clear tags button 2024-08-01 00:31:58 +03:00
17963269f3 correct the cache path in gitignore 2024-08-01 00:27:07 +03:00
00b721caab implement sorting 2024-08-01 00:25:42 +03:00
02216efb7d add browser list 2024-07-01 16:38:47 +03:00
ce973f064b i have done this 2024-07-01 16:14:05 +03:00
8e96858391 i consider this generally cleaned up 2024-07-01 16:13:31 +03:00
6f7b9b7350 remove unused templates 2024-07-01 03:24:11 +03:00
a8a1dca444 add alt text 2024-07-01 03:21:33 +03:00
cee11ba07a add TODO 2024-07-01 03:17:21 +03:00
38d93a66ba add color to index 2024-07-01 03:16:17 +03:00
3623b61fbe optimization 2024-07-01 03:14:26 +03:00
bd093e7c20 add default color 2024-07-01 02:53:04 +03:00
658ddaf820 document options 2024-07-01 02:44:43 +03:00
342a353b36 better post icon and color support 2024-07-01 02:34:40 +03:00
602f57581a remove shortcut icon 2024-07-01 02:24:32 +03:00
41228d55b6 add more meta tags 2024-06-29 03:01:48 +03:00
2544184251 move sigterm functionality out 2024-06-13 23:43:34 +03:00
7ad03e91bc move sigterm functionality out 2024-06-13 23:43:03 +03:00
c65225698f don't align text to right 2024-06-13 23:22:18 +03:00
9dfe0ebddf fixup app.rs from merge confliict 2024-06-13 23:19:47 +03:00
1a6dcc2c17 Merge branch 'refactor' 2024-06-13 23:19:16 +03:00
84932c0d1e add custom date formatting and client side date formatting 2024-06-13 21:52:18 +03:00
bd7823dc14 replace single char string pattern with char pattern 2024-05-26 20:24:08 +03:00
516e791ad6 advertise rss 2024-05-26 20:23:40 +03:00
cf102126b3 move the rest of markdown-related stuff into it's own file 2024-05-14 12:27:18 +03:00
897e1cbf88 move MarkdownPosts into it's own file 2024-05-14 10:23:40 +03:00
cc41ba9421 refactor part 2: create PostManager trait 2024-05-14 10:11:41 +03:00
a7b5472fc6 clean up new function 2024-05-09 11:30:18 +03:00
a19c576275 refactor part 1: move code 2024-05-08 23:03:10 +03:00
c1e1670db3 add note when there are no posts 2024-05-04 00:24:29 +03:00
95cd0323a2 fix config env var 2024-05-04 00:18:40 +03:00
759a792043 add todo 2024-05-02 20:20:07 +03:00
8678758440 remove intermediate template and cache rendered markdown directly instead 2024-05-02 20:17:46 +03:00
ad2a8c6ba4 only apply tag to rss 2024-05-02 20:09:28 +03:00
457692f766 add rss 2024-05-02 19:23:20 +03:00
086ddb7665 remove arc unwrapping 2024-05-02 17:44:01 +03:00
3be39df282 tick another entry 2024-05-01 23:23:19 +03:00
d466f531eb add tags and cache versioning 2024-05-01 23:12:52 +03:00
37c344b53c add prettierrc 2024-05-01 22:52:24 +03:00
f86165ab94 unwhite tracing 2024-05-01 19:53:34 +03:00
d2976b2684 optimize imports & features 2024-05-01 18:59:34 +03:00
573ea75167 revamp config and add tags support 2024-05-01 18:25:01 +03:00
2fa22a2752 fix: add systemtime_as_secs 2024-04-30 17:34:46 +03:00
3ae6d79296 add tags to todo 2024-04-30 17:22:31 +03:00
9ba687bdae remove unused feature 2024-04-30 11:44:40 +03:00
4ac5223149 update docs to recommend nightly 2024-04-30 11:44:00 +03:00
6a92c1713d cache cleanup for all! we can enable persistence now 2024-04-30 11:41:35 +03:00
b9f6d98d49 always follow syntect theme 2024-04-30 11:14:39 +03:00
91b48850db switch to scc's serde implementation 2024-04-30 09:38:50 +03:00
0b762a36f3 update todo 2024-04-30 09:34:51 +03:00
2c7fef312a remove unnecessary nightly features and precompression deps 2024-04-23 23:12:44 +03:00
964bae736f remove compression goal 2024-04-23 21:09:21 +03:00
b0006664e6 remove syntect-to-css 2024-04-23 16:21:52 +03:00
76f4e0358b remove old options 2024-04-21 01:45:00 +03:00
64954a3d5c respect compession level 2024-04-21 00:46:28 +03:00
5930df6609 remove redundant code 2024-04-21 00:38:30 +03:00
9046ac910d change posterror back to apperror 2024-04-20 23:46:20 +03:00
07e6b4e844 Merge pull request 'update readme.md' (#1) from pandarose/bingus-blog:main into main 2024-04-20 23:37:02 +03:00
Reviewed-on: slonk/bingus-blog#1
ed1a858d51 add compression to cache and remove precompression 2024-04-20 23:02:23 +03:00
18385d3e57 make cache optional 2024-04-20 20:59:00 +03:00
43 changed files with 2756 additions and 1497 deletions

2
.gitignore vendored

@@ -3,5 +3,5 @@
/media/*
/posts/*
!/posts/README.md
/.slbg-cache
/cache
/config.toml

5
.prettierrc Normal file

@@ -0,0 +1,5 @@
{
"useTabs": true,
"tabWidth": 4,
"printWidth": 140
}

51
BUILDING.md Normal file
View file

@@ -0,0 +1,51 @@
# Building bingus-blog
this guide assumes you have git and are on linux.
at the moment, compiling on windows is supported, but not _for windows_.
1. first, acquire _rust nightly_.
the recommended method is to install [rustup](https://rustup.rs/),
and use that to get _rust nightly_. choose "customize installation",
and set "default toolchain" to nightly to save time later, provided
you do not need _rust stable_ for something else
2. start your favorite terminal
3. then, download the repository: `git clone https://git.slonk.ing/slonk/bingus-blog && cd bingus-blog`
4. finally, build the application: `cargo +nightly build --release`
5. your executable is `target/release/bingus-blog`, copy it to your server and
you're done!
## Building for another architecture
you can use the `--target` flag in `cargo build` for this purpose.
examples are for Arch Linux x86_64.
here's how to compile for `aarch64-unknown-linux-gnu`
(eg. Oracle CI Free Tier ARM VPS):
```sh
# install the required packages to compile and link aarch64 binaries
sudo pacman -S aarch64-linux-gnu-gcc
cargo +nightly build --release --target=aarch64-unknown-linux-gnu
```
your executable will be `target/aarch64-unknown-linux-gnu/release/bingus-blog`.
---
a more tricky example is building for `aarch64-unknown-linux-musl`
(eg. a Redmi 5 Plus running postmarketOS):
```sh
# there is no toolchain for aarch64-unknown-linux-musl,
# so we have to repurpose the GNU toolchain. this doesn't
# work out of the box so we have to set some environment variables
sudo pacman -S aarch64-linux-gnu-gcc
export CC=aarch64-linux-gnu-gcc
export CARGO_TARGET_AARCH64_UNKNOWN_LINUX_MUSL_LINKER=$CC
cargo +nightly build --release --target=aarch64-unknown-linux-musl
# the reason we had to do this is because cargo tries to use
# the same toolchain as the target's name. but we can tell it to use
# the GNU one like so.
```
your executable will be `target/aarch64-unknown-linux-musl/release/bingus-blog`.

67
CONFIG.md Normal file

@@ -0,0 +1,67 @@
# Configuration
the configuration format, with defaults, is documented below:
```toml
title = "bingus-blog" # title of the blog
# description of the blog
description = "blazingly fast markdown blog software written in rust memory safe"
markdown_access = true # allow users to see the raw markdown of a post
# endpoint: /posts/<name>.md
js_enable = true # enable javascript (required for sorting and dates)
[style]
date_format = "RFC3339" # format string used to format dates in the backend
# it's highly recommended to leave this as default,
# so the date can be formatted by the browser.
# format: https://docs.rs/chrono/latest/chrono/format/strftime/index.html#specifiers
default_sort = "date" # default sorting method ("date" or "name")
#default_color = "#f5c2e7" # default embed color, optional
[style.display_dates]
creation = true # display creation ("written") dates
modification = true # display modified ("last modified") dates
[rss]
enable = false # serve an rss feed under /feed.xml
# this may be a bit resource intensive
link = "https://..." # public url of the blog, required if rss is enabled
[dirs]
posts = "posts" # where posts are stored
media = "media" # directory served under /media/
custom_templates = "templates" # custom templates dir
custom_static = "static" # custom static dir
# see CUSTOM.md for documentation
[http]
host = "0.0.0.0" # ip to listen on
port = 3000 # port to listen on
[cache]
enable = true # save metadata and rendered posts into RAM
# highly recommended, only turn off if absolutely necessary
cleanup = true # clean cache, highly recommended
#cleanup_interval = 86400000 # clean the cache regularly instead of just at startup
# uncomment to enable
persistence = true # save the cache to disk on shutdown and load it on startup
file = "cache" # file to save the cache to
compress = true # compress the cache file
compression_level = 3 # zstd compression level, 3 is recommended
[render]
syntect.load_defaults = false # include default syntect themes
syntect.themes_dir = "themes" # directory to include themes from
syntect.theme = "Catppuccin Mocha" # theme file name (without `.tmTheme`)
```
configuration is done in [TOML](https://toml.io/)
if an option marked "optional" is not set, it will not be initialized with
a default value
you don't have to copy the whole thing from here,
it's generated by the program if it doesn't exist
## Specifying the configuration file
the configuration file is loaded from `config.toml` by default, but the path
can be overridden by setting the environment variable `BINGUS_BLOG_CONFIG`,
which will make bingus-blog try to read that file or fail and exit.
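that lookup can be sketched in Rust; the env var name `BINGUS_BLOG_CONFIG` and the default `config.toml` come from above, but the helper and its signature are illustrative, not the program's actual API:

```rust
use std::ffi::OsString;
use std::path::PathBuf;

// resolve the config file path: the BINGUS_BLOG_CONFIG value wins if set,
// otherwise fall back to `config.toml` in the working directory.
// (illustrative helper; the real loader then reads and parses that file,
// failing and exiting if the overridden path cannot be read)
fn resolve_config_path(env_value: Option<OsString>) -> PathBuf {
    env_value
        .map(PathBuf::from)
        .unwrap_or_else(|| PathBuf::from("config.toml"))
}

fn main() {
    // in the real program this would be `std::env::var_os("BINGUS_BLOG_CONFIG")`
    assert_eq!(resolve_config_path(None), PathBuf::from("config.toml"));
    assert_eq!(
        resolve_config_path(Some(OsString::from("/etc/bingus-blog.toml"))),
        PathBuf::from("/etc/bingus-blog.toml")
    );
}
```

taking the env value as a parameter keeps the fallback order testable without touching process state.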

49
CUSTOM.md Normal file

@@ -0,0 +1,49 @@
# Custom Content
bingus-blog supports loading custom content such as templates and static files
at runtime from custom locations.
the configuration options `dirs.custom_templates` and `dirs.custom_static`
allow you to set where these files are loaded from.
customizing the error page, other than CSS, is not supported at this time.
## Custom Templates
custom templates are written in
[Handlebars (the rust variant)](https://crates.io/crates/handlebars).
the *custom templates directory* has a non-recursive structure:
```md
./
- index.html # ignored
- index.hbs # loaded as `index`
- post.hbs # loaded as `post`
- [NAME].hbs # loaded as `[NAME]`
- ...
```
templates are loaded first from the executable, then from the custom
templates path, with the latter overriding the defaults.
template changes are also processed after startup: any changed template is
compiled and replaces the existing template in the registry, or adds a
new one (though that does nothing).
if a template is deleted, the default template will be recompiled into
its place.
note that the watcher only works if the *custom templates directory* existed
at startup. if you delete/create the directory, you must restart the program.
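the load order above amounts to a simple map overlay. in this sketch, names and string contents stand in for compiled Handlebars templates, and `build_registry` is a hypothetical helper, not the program's actual API:

```rust
use std::collections::HashMap;

// build the template registry: embedded defaults first, then anything in
// the custom templates directory, which overrides entries of the same name
fn build_registry(
    embedded: &[(&str, &str)],
    custom: &[(&str, &str)],
) -> HashMap<String, String> {
    let mut reg = HashMap::new();
    // later inserts win, so custom templates shadow the defaults
    for (name, src) in embedded.iter().chain(custom) {
        reg.insert(name.to_string(), src.to_string());
    }
    reg
}

fn main() {
    let reg = build_registry(
        &[("index", "embedded index"), ("post", "embedded post")],
        &[("index", "custom index")],
    );
    assert_eq!(reg["index"], "custom index"); // overridden by the custom dir
    assert_eq!(reg["post"], "embedded post"); // default kept
}
```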
## Custom Static Files
GET requests to `/static` will first be checked against `dirs.custom_static`.
if the file is not found in the *custom static directory*, bingus-blog will try
to serve it from the directory embedded in the executable. this means you can
add whatever you want in the *custom static directory* and it will be served
under `/static`.
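the fallback described above is a two-step lookup. `StaticSource` and the injected existence check below are illustrative; the real server wires this up with `ServeDir` and an embedded-directory fallback:

```rust
use std::path::{Path, PathBuf};

// where a /static request ends up being served from
#[derive(Debug, PartialEq)]
enum StaticSource {
    Custom(PathBuf), // found in dirs.custom_static
    Embedded,        // fall back to the files baked into the executable
}

// check the custom static directory first, then fall back to the embedded
// files; `exists` is injected so the lookup order is easy to demonstrate
fn resolve_static(
    custom_dir: &Path,
    name: &str,
    exists: impl Fn(&Path) -> bool,
) -> StaticSource {
    let candidate = custom_dir.join(name);
    if exists(&candidate) {
        StaticSource::Custom(candidate)
    } else {
        StaticSource::Embedded
    }
}

fn main() {
    // pretend only style.css was dropped into the custom directory
    let have = |p: &Path| p == Path::new("static/style.css");
    assert_eq!(
        resolve_static(Path::new("static"), "style.css", have),
        StaticSource::Custom(PathBuf::from("static/style.css"))
    );
    assert_eq!(
        resolve_static(Path::new("static"), "logo.png", have),
        StaticSource::Embedded
    );
}
```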
## Custom Media
the endpoint `/media` is served from `dirs.media`. no other logic or mechanism
is present.

1191
Cargo.lock generated

File diff suppressed because it is too large


@@ -4,42 +4,59 @@ version = "0.1.0"
edition = "2021"
default-run = "bingus-blog"
[[bin]]
name = "syntect-to-css"
required-features = ["clap"]
[features]
default = ["precompression"]
default = []
tokio-console = ["dep:console-subscriber"]
clap = ["dep:clap"]
precompression = ["dep:async-compression"]
[profile.release]
lto = "fat"
opt-level = 3
codegen-units = 1
strip = true
[dependencies]
askama = { version = "0.12.1", features = ["with-axum"] }
askama_axum = "0.4.0"
async-compression = { version = "0.4.8", optional = true }
axum = { version = "0.7.5", features = ["macros"] }
bitcode = { version = "0.6.0", features = ["serde"] }
chrono = { version = "0.4.37", features = ["serde"] }
clap = { version = "4.5.4", features = ["derive"], optional = true }
askama = { version = "0.12.1", features = [
"with-axum",
], default-features = false }
askama_axum = { version = "0.4.0", default-features = false }
axum = { version = "0.7.5", features = [
"http1",
"json",
"query",
"macros",
"tokio",
"tracing",
], default-features = false }
bitcode = { version = "0.6.0", features = ["serde"], default-features = false }
chrono = { version = "0.4.37", features = [
"std",
"serde",
], default-features = false }
color-eyre = "0.6.3"
comrak = { version = "0.22.0", features = ["syntect"] }
comrak = { version = "0.22.0", features = [
"syntect",
], default-features = false }
console-subscriber = { version = "0.2.0", optional = true }
derive_more = "0.99.17"
fronma = "0.2.0"
notify = "6.1.1"
scc = "2.1.0"
handlebars = "6.0.0"
include_dir = "0.7.4"
mime_guess = "2.0.5"
notify-debouncer-full = { version = "0.3.1", default-features = false }
rss = "2.0.7"
scc = { version = "2.1.0", features = ["serde"] }
serde = { version = "1.0.197", features = ["derive"] }
serde_json = { version = "1.0.124", features = ["preserve_order"] }
syntect = "5.2.0"
thiserror = "1.0.58"
tokio = { version = "1.37.0", features = ["full"] }
tokio-util = "0.7.10"
tokio = { version = "1.37.0", features = [
"fs",
"macros",
"rt-multi-thread",
"signal",
] }
tokio-util = { version = "0.7.10", default-features = false }
toml = "0.8.12"
tower = "0.4.13"
tower-http = { version = "0.5.2", features = [
"compression-gzip",
"fs",
@@ -47,3 +64,5 @@ tower-http = { version = "0.5.2", features = [
], default-features = false }
tracing = "0.1.40"
tracing-subscriber = { version = "0.3.18", features = ["env-filter"] }
url = { version = "2.5.0", features = ["serde"] }
zstd = { version = "0.13.1", default-features = false }

132
README.md

@@ -1,7 +1,7 @@
---
title: "README"
description: "the README.md file of this project"
author: "slonkazoid"
title: README
description: the README.md file of this project
author: slonkazoid
created_at: 2024-04-18T04:15:26+03:00
---
@@ -9,76 +9,52 @@ created_at: 2024-04-18T04:15:26+03:00
blazingly fast markdown blog software written in rust memory safe
for bingus-blog viewers: [see original document](https://git.slonk.ing/slonk/bingus-blog)
## Features
- posts are written in markdown and loaded at runtime, meaning you
can write posts from anywhere and sync them with the server without headache
- RSS is supported
- the look of the blog is extremely customizable, with support for
[custom drop-ins](CUSTOM.md) for both templates and static content
- really easy to deploy (the server is one executable file)
- blazingly fast
## TODO
- [ ] RSS
- [x] finish writing this document
- [x] document config
- [ ] blog thumbnail and favicon
- [ ] sort asc/desc
- [ ] extend syntect options
- [ ] general cleanup of code
- [ ] make `compress.rs` not suck
- [ ] better error reporting and pages
- [ ] ^ fix syntect mutex poisoning
- [ ] better error reporting and error pages
- [ ] better tracing
- [ ] cache cleanup task
- [ ] (de)compress cache with zstd on startup/shutdown
- [ ] replace HashMap with HashCache once i implement [this](https://github.com/wvwwvwwv/scalable-concurrent-containers/issues/139)
- [ ] make date parsing less strict
- [ ] make date formatting better
- [ ] clean up imports and require less features
- [ ] improve home page
- [ ] multi-language support
- [ ] add credits
- [x] be blazingly fast
- [x] 100+ MiB binary size
## Configuration
the default configuration with comments looks like this
see [CONFIG.md](CONFIG.md)
```toml
# main settings
host = "0.0.0.0" # ip to listen on
port = 3000 # port to listen on
title = "bingus-blog" # title of the website
description = "blazingly fast markdown blog software written in rust memory safe" # description of the website
posts_dir = "posts" # where posts are stored
#cache_file = "..." # file to serialize the cache into on shutdown, and
# to deserialize from on startup. uncomment to enable
markdown_access = true # allow users to see the raw markdown of a post
## Building
[render] # rendering-specific settings
syntect.load_defaults = false # include default syntect themes
syntect.themes_dir = "themes" # directory to include themes from
syntect.theme = "Catppuccin Mocha" # theme file name (without `.tmTheme`)
[precompression] # precompression settings
enable = false # gzip every file in static/ on startup
watch = true # keep watching and gzip files as they change
```
you don't have to copy it from here, it's generated if it doesn't exist
## Usage
this project uses nightly-only features.
make sure you have the nightly toolchain installed.
build the application with `cargo`:
```sh
cargo build --release
cargo +nightly build --release
```
the executable will be located at `target/release/bingus-blog`.
### Building for another architecture
you can use the `--target` flag in `cargo build` for this purpose
building for `aarch64-unknown-linux-musl` (for example, a Redmi 5 Plus running postmarketOS):
```sh
# install the required packages to compile and link aarch64 binaries
sudo pacman -S aarch64-linux-gnu-gcc
export CC=aarch64-linux-gnu-gcc
export CARGO_TARGET_AARCH64_UNKNOWN_LINUX_MUSL_LINKER=$CC
cargo build --release --target=aarch64-unknown-linux-musl
```
your executable will be located at `target/<target>/release/bingus-blog` this time.
see [BUILDING.md](BUILDING.md) for more information and detailed instructions.
## Writing Posts
@@ -97,15 +73,22 @@ every post **must** begin with **valid** front matter, else it won't be listed
in / & /posts, and when you navigate to it, you will be met with an error page.
the error page will tell you what the problem is.
example:
full example:
```md
---
title: "README"
description: "the README.md file of this project"
author: "slonkazoid"
created_at: 2024-04-18T04:15:26+03:00
#modified_at: ... # see above
title: My first post # title of the post
description: The first post on this awesome blog! # short description of the post
author: Blubber256 # author of the post
icon: /media/first-post/icon.png # icon/thumbnail of post used in embeds
icon_alt: Picture of a computer running DOOM
color: "#00aacc" # color of post, also used in embeds
created_at: 2024-04-18T04:15:26+03:00 # date of writing, this is highly
# recommended if you are on a system which doesn't have btime (like musl),
# because this is fetched from file stats by default
#modified_at: ... # see above. this is also fetched from the filesystem
tags: # tags, or keywords, used in meta and also in the ui
- lifestyle
---
```
@@ -124,10 +107,39 @@ standard. examples of valid and invalid dates:
- # everything else is also invalid
```
## Routes
## Non-static Routes
- `GET /`: index page, lists posts
- `GET /posts`: returns a list of all posts with metadata in JSON format
- `GET /posts/<name>`: view a post
- `GET /posts/<name>.md`: view the raw markdown of a post
- `GET /post/*`: redirects to `/posts/*`
- `GET /feed.xml`: RSS feed
## Cache
bingus-blog caches every post retrieved and keeps it permanently in cache.
there is a toggleable cleanup task that periodically sweeps the cache to
remove dead entries, but it can still get quite big.
if cache persistence is on, the cache is (compressed &) written to disk on
shutdown, and read (& decompressed) on startup. one may opt to set the cache
location to point to a tmpfs to make it save and load quickly, but not persist
across reboots at the cost of more RAM usage.
in my testing, the compression reduced a 3.21 MB cache to 0.18 MB almost
instantly. there is basically no good reason to not have compression on,
unless you have filesystem compression already of course.
## Contributing
make sure your changes don't break firefox, chromium, text-based browsers,
and webkit support
### Feature Requests
i want this project to be a good and usable piece of software, so i implement
feature requests provided they fit the project and its values.
most people just ping me on discord with feature requests, but if your request is
non-trivial, please create an issue [here](https://git.slonk.ing/slonk/bingus-blog/issues).

18
partials/post_table.hbs Normal file

@@ -0,0 +1,18 @@
<div class="table">
{{#if (and (ne this.created_at null) style.display_dates.creation)}}
<div class="created">written</div>
<div class="created value">{{>span_date dt=this.created_at df=style.date_format}}</div>
{{/if}}
{{#if (and (ne this.modified_at null) style.display_dates.modification)}}
<div class="modified">last modified</div>
<div class="modified value">{{>span_date dt=this.modified_at df=style.date_format}}</div>
{{/if}}
{{#if (gt (len this.tags) 0)}}
<div class="tags">tags</div>
<div class="tags value">
{{#each this.tags}}
<a href="/?tag={{this}}" title="view all posts with this tag">{{this}}</a>
{{/each}}
</div>
{{/if}}
</div>

1
partials/span_date.hbs Normal file

@@ -0,0 +1 @@
<span class="date {{#if (eq df "RFC3339")}}date-rfc3339{{/if}}">{{date dt df}}</span>

279
src/app.rs Normal file

@@ -0,0 +1,279 @@
use std::collections::HashMap;
use std::sync::Arc;
use std::time::Duration;
use axum::extract::{Path, Query, State};
use axum::http::header::CONTENT_TYPE;
use axum::http::Request;
use axum::response::{IntoResponse, Redirect, Response};
use axum::routing::get;
use axum::{Json, Router};
use handlebars::Handlebars;
use include_dir::{include_dir, Dir};
use rss::{Category, ChannelBuilder, ItemBuilder};
use serde::{Deserialize, Serialize};
use serde_json::Map;
use tokio::sync::RwLock;
use tower::service_fn;
use tower_http::services::ServeDir;
use tower_http::trace::TraceLayer;
use tracing::{info, info_span, Span};
use crate::config::{Config, StyleConfig};
use crate::error::{AppError, AppResult};
use crate::post::{MarkdownPosts, PostManager, PostMetadata, RenderStats, ReturnedPost};
use crate::serve_dir_included::handle;
const STATIC: Dir<'static> = include_dir!("$CARGO_MANIFEST_DIR/static");
#[derive(Clone)]
#[non_exhaustive]
pub struct AppState {
pub config: Arc<Config>,
pub posts: Arc<MarkdownPosts<Arc<Config>>>,
pub reg: Arc<RwLock<Handlebars<'static>>>,
}
#[derive(Serialize)]
struct IndexTemplate<'a> {
title: &'a str,
description: &'a str,
posts: Vec<PostMetadata>,
rss: bool,
js: bool,
tags: Map<String, serde_json::Value>,
joined_tags: String,
style: &'a StyleConfig,
}
#[derive(Serialize)]
struct PostTemplate<'a> {
meta: &'a PostMetadata,
rendered: String,
rendered_in: RenderStats,
markdown_access: bool,
js: bool,
color: Option<&'a str>,
joined_tags: String,
style: &'a StyleConfig,
}
#[derive(Deserialize)]
struct QueryParams {
tag: Option<String>,
#[serde(rename = "n")]
num_posts: Option<usize>,
}
fn collect_tags(posts: &Vec<PostMetadata>) -> Map<String, serde_json::Value> {
let mut tags = HashMap::new();
for post in posts {
for tag in &post.tags {
if let Some((existing_tag, count)) = tags.remove_entry(tag) {
tags.insert(existing_tag, count + 1);
} else {
tags.insert(tag.clone(), 1);
}
}
}
let mut tags: Vec<(String, u64)> = tags.into_iter().collect();
tags.sort_unstable_by_key(|(v, _)| v.clone());
tags.sort_by_key(|(_, v)| -(*v as i64));
let mut map = Map::new();
for tag in tags.into_iter() {
map.insert(tag.0, tag.1.into());
}
map
}
fn join_tags_for_meta(tags: &Map<String, serde_json::Value>, delim: &str) -> String {
let mut s = String::new();
let tags = tags.keys().enumerate();
let len = tags.len();
for (i, tag) in tags {
s += tag;
if i != len - 1 {
s += delim;
}
}
s
}
async fn index<'a>(
State(AppState {
config, posts, reg, ..
}): State<AppState>,
Query(query): Query<QueryParams>,
) -> AppResult<impl IntoResponse> {
let posts = posts
.get_max_n_post_metadata_with_optional_tag_sorted(query.num_posts, query.tag.as_ref())
.await?;
let tags = collect_tags(&posts);
let joined_tags = join_tags_for_meta(&tags, ", ");
let reg = reg.read().await;
let rendered = reg.render(
"index",
&IndexTemplate {
title: &config.title,
description: &config.description,
posts,
rss: config.rss.enable,
js: config.js_enable,
tags,
joined_tags,
style: &config.style,
},
);
drop(reg);
Ok(([(CONTENT_TYPE, "text/html")], rendered?))
}
async fn all_posts(
State(AppState { posts, .. }): State<AppState>,
Query(query): Query<QueryParams>,
) -> AppResult<Json<Vec<PostMetadata>>> {
let posts = posts
.get_max_n_post_metadata_with_optional_tag_sorted(query.num_posts, query.tag.as_ref())
.await?;
Ok(Json(posts))
}
async fn rss(
State(AppState { config, posts, .. }): State<AppState>,
Query(query): Query<QueryParams>,
) -> AppResult<Response> {
if !config.rss.enable {
return Err(AppError::RssDisabled);
}
let posts = posts
.get_all_posts(|metadata, _| {
!query
.tag
.as_ref()
.is_some_and(|tag| !metadata.tags.contains(tag))
})
.await?;
let mut channel = ChannelBuilder::default();
channel
.title(&config.title)
.link(config.rss.link.to_string())
.description(&config.description);
//TODO: .language()
for (metadata, content, _) in posts {
channel.item(
ItemBuilder::default()
.title(metadata.title)
.description(metadata.description)
.author(metadata.author)
.categories(
metadata
.tags
.into_iter()
.map(|tag| Category {
name: tag,
domain: None,
})
.collect::<Vec<Category>>(),
)
.pub_date(metadata.created_at.map(|date| date.to_rfc2822()))
.content(content)
.link(
config
.rss
.link
.join(&format!("/posts/{}", metadata.name))?
.to_string(),
)
.build(),
);
}
let body = channel.build().to_string();
drop(channel);
Ok(([(CONTENT_TYPE, "text/xml")], body).into_response())
}
async fn post(
State(AppState {
config, posts, reg, ..
}): State<AppState>,
Path(name): Path<String>,
) -> AppResult<impl IntoResponse> {
match posts.get_post(&name).await? {
ReturnedPost::Rendered(ref meta, rendered, rendered_in) => {
let joined_tags = meta.tags.join(", ");
let reg = reg.read().await;
let rendered = reg.render(
"post",
&PostTemplate {
meta,
rendered,
rendered_in,
markdown_access: config.markdown_access,
js: config.js_enable,
color: meta
.color
.as_deref()
.or(config.style.default_color.as_deref()),
joined_tags,
style: &config.style,
},
);
drop(reg);
Ok(([(CONTENT_TYPE, "text/html")], rendered?).into_response())
}
ReturnedPost::Raw(body, content_type) => {
Ok(([(CONTENT_TYPE, content_type)], body).into_response())
}
}
}
pub fn new(config: &Config) -> Router<AppState> {
Router::new()
.route("/", get(index))
.route(
"/post/:name",
get(
|Path(name): Path<String>| async move { Redirect::to(&format!("/posts/{}", name)) },
),
)
.route("/posts/:name", get(post))
.route("/posts", get(all_posts))
.route("/feed.xml", get(rss))
.nest_service(
"/static",
ServeDir::new(&config.dirs.custom_static)
.precompressed_gzip()
.fallback(service_fn(|req| handle(req, &STATIC))),
)
.nest_service("/media", ServeDir::new(&config.dirs.media))
.layer(
TraceLayer::new_for_http()
.make_span_with(|request: &Request<_>| {
info_span!(
"request",
method = ?request.method(),
path = ?request.uri().path(),
)
})
.on_response(|response: &Response<_>, duration: Duration, span: &Span| {
let _ = span.enter();
let status = response.status();
info!(?status, ?duration, "response");
}),
)
}


@@ -1,20 +0,0 @@
use std::{
ffi::{OsStr, OsString},
path::{Path, PathBuf},
};
// i will kill you rust stdlib
pub trait Append<T>
where
Self: Into<OsString>,
T: From<OsString>,
{
fn append(self, ext: impl AsRef<OsStr>) -> T {
let mut buffer: OsString = self.into();
buffer.push(ext.as_ref());
T::from(buffer)
}
}
impl Append<PathBuf> for PathBuf {}
impl Append<PathBuf> for &Path {}


@@ -1,76 +0,0 @@
use std::fs::File;
use std::io::BufReader;
use std::path::PathBuf;
use clap::Parser;
use color_eyre::eyre::{self, Context, Ok, OptionExt};
use syntect::highlighting::{Theme, ThemeSet};
use syntect::html::{css_for_theme_with_class_style, ClassStyle};
#[derive(Parser, Debug)]
#[command(about = "generate CSS from a syntect theme")]
struct Args {
#[command(subcommand)]
command: Command,
#[arg(
short,
long,
help = "prefix for generated classes",
default_value = "syntect-"
)]
prefix: String,
#[arg(
long,
help = "don't add a prefix to generated classes",
default_value_t = false
)]
no_prefix: bool,
}
#[derive(Parser, Debug)]
enum Command {
#[command(about = "generate CSS from a theme in the default theme set")]
Default {
#[arg(help = "name of theme (no .tmTheme)")]
theme_name: String,
},
#[command(about = "generate CSS from a .tmTheme file")]
File {
#[arg(help = "path to theme (including .tmTheme)")]
path: PathBuf,
},
}
fn main() -> eyre::Result<()> {
let args = Args::parse();
color_eyre::install()?;
let theme = match args.command {
Command::Default { theme_name } => {
let ts = ThemeSet::load_defaults();
ts.themes
.get(&theme_name)
.ok_or_eyre(format!("theme {:?} doesn't exist", theme_name))?
.to_owned()
}
Command::File { path } => {
let mut file = BufReader::new(
File::open(&path).with_context(|| format!("failed to open {:?}", path))?,
);
ThemeSet::load_from_reader(&mut file).with_context(|| "failed to parse theme")?
}
};
let class_style = if args.no_prefix {
ClassStyle::Spaced
} else {
ClassStyle::SpacedPrefixed {
prefix: args.prefix.leak(),
}
};
let css = css_for_theme_with_class_style(&theme, class_style)
.with_context(|| "failed to generate css")?;
println!("{css}");
Ok(())
}


@@ -1,60 +0,0 @@
// TODO: make this bearable
use std::{
fs::{self, Metadata},
io::{self, Result},
path::Path,
process::{Child, Command},
sync::Mutex,
};
fn compress_file(path: &Path, metadata: Metadata, handles: &Mutex<Vec<Child>>) -> Result<()> {
let compressed_file = format!("{}.gz", path.to_str().unwrap());
if match fs::metadata(compressed_file) {
Ok(existing_metadata) => metadata.modified()? > existing_metadata.modified()?,
Err(err) => match err.kind() {
io::ErrorKind::NotFound => true,
_ => return Err(err),
},
} {
let mut handles_guard = handles.lock().unwrap();
handles_guard.push(Command::new("gzip").arg("-kf5").arg(path).spawn()?);
}
Ok(())
}
fn compress_recursively(path: &Path, handles: &Mutex<Vec<Child>>) -> Result<()> {
let metadata = fs::metadata(path)?;
if metadata.is_dir() {
for entry in fs::read_dir(path)? {
compress_recursively(&entry?.path(), handles)?
}
Ok(())
} else if match path.extension() {
Some(ext) => ext == "gz",
None => false,
} || metadata.is_symlink()
{
Ok(())
} else {
compress_file(path, metadata, handles)
}
}
pub fn compress_epicly<P: AsRef<Path>>(path: P) -> Result<u64> {
let mut i = 0;
let handles = Mutex::new(Vec::new());
compress_recursively(AsRef::<Path>::as_ref(&path), &handles)?;
let handles = handles.into_inner().unwrap();
for mut handle in handles {
assert!(handle.wait().unwrap().success());
i += 1;
}
Ok(i)
}


@@ -1,91 +1,199 @@
use std::{
env,
net::{IpAddr, Ipv4Addr},
path::PathBuf,
};
use std::env;
use std::net::{IpAddr, Ipv6Addr};
use std::path::PathBuf;
use color_eyre::eyre::{bail, Context, Result};
use serde::{Deserialize, Serialize};
use tokio::io::{AsyncReadExt, AsyncWriteExt};
use tracing::{error, info};
use tracing::{error, info, instrument};
use url::Url;
use crate::ranged_i128_visitor::RangedI128Visitor;
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, Eq, Hash)]
#[serde(default)]
pub struct SyntectConfig {
pub load_defaults: bool,
pub themes_dir: Option<PathBuf>,
pub theme: Option<String>,
}
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, Eq, Hash)]
#[derive(Serialize, Deserialize, Debug, Clone, PartialEq, Eq, Hash, Default)]
#[serde(default)]
pub struct RenderConfig {
pub syntect: SyntectConfig,
}
#[cfg(feature = "precompression")]
#[derive(Serialize, Deserialize, Debug, Clone)]
#[serde(default)]
pub struct PrecompressionConfig {
pub struct CacheConfig {
pub enable: bool,
pub watch: bool,
pub cleanup: bool,
pub cleanup_interval: Option<u64>,
pub persistence: bool,
pub file: PathBuf,
pub compress: bool,
#[serde(deserialize_with = "check_zstd_level_bounds")]
pub compression_level: i32,
}
#[derive(Serialize, Deserialize, Debug, Clone)]
#[serde(default)]
pub struct HttpConfig {
pub host: IpAddr,
pub port: u16,
}
#[derive(Serialize, Deserialize, Debug, Clone)]
#[serde(default)]
pub struct DirsConfig {
pub posts: PathBuf,
pub media: PathBuf,
pub custom_static: PathBuf,
pub custom_templates: PathBuf,
}
#[derive(Serialize, Deserialize, Debug, Clone)]
pub struct RssConfig {
pub enable: bool,
pub link: Url,
}
#[derive(Serialize, Deserialize, Debug, Clone, Default)]
pub enum DateFormat {
#[default]
RFC3339,
#[serde(untagged)]
Strftime(String),
}
#[derive(Serialize, Deserialize, Debug, Clone, Default, Copy, PartialEq, Eq)]
#[serde(rename_all = "lowercase")]
#[repr(u8)]
pub enum Sort {
#[default]
Date,
Name,
}
#[derive(Serialize, Deserialize, Debug, Clone)]
#[serde(default)]
#[derive(Default)]
pub struct StyleConfig {
pub display_dates: DisplayDates,
pub date_format: DateFormat,
pub default_sort: Sort,
pub default_color: Option<String>,
}
#[derive(Serialize, Deserialize, Debug, Clone)]
#[serde(default)]
pub struct DisplayDates {
pub creation: bool,
pub modification: bool,
}
#[derive(Serialize, Deserialize, Debug, Clone)]
#[serde(default)]
pub struct Config {
pub host: IpAddr,
pub port: u16,
pub title: String,
pub description: String,
pub posts_dir: PathBuf,
pub render: RenderConfig,
#[cfg(feature = "precompression")]
pub precompression: PrecompressionConfig,
pub cache_file: Option<PathBuf>,
pub markdown_access: bool,
pub js_enable: bool,
pub style: StyleConfig,
pub rss: RssConfig,
pub dirs: DirsConfig,
pub http: HttpConfig,
pub render: RenderConfig,
pub cache: CacheConfig,
}
impl Default for Config {
fn default() -> Self {
Self {
host: IpAddr::V4(Ipv4Addr::UNSPECIFIED),
port: 3000,
title: "bingus-blog".into(),
description: "blazingly fast markdown blog software written in rust memory safe".into(),
render: Default::default(),
posts_dir: "posts".into(),
#[cfg(feature = "precompression")]
precompression: Default::default(),
cache_file: None,
markdown_access: true,
js_enable: true,
style: Default::default(),
// i have a love-hate relationship with serde
// it was enigmatic at first, but then i started actually using it
// writing my own serialize and deserialize implementations.. spending
// a lot of time in the docs trying to understand each and every option..
// now with this knowledge i can do stuff like this! (see rss field)
// and i'm proud to say that it still makes 0 sense.
rss: RssConfig {
enable: false,
link: Url::parse("http://example.com").unwrap(),
},
dirs: Default::default(),
http: Default::default(),
render: Default::default(),
cache: Default::default(),
}
}
}
impl Default for RenderConfig {
impl Default for DisplayDates {
fn default() -> Self {
Self {
creation: true,
modification: true,
}
}
}
impl Default for DirsConfig {
fn default() -> Self {
Self {
posts: "posts".into(),
media: "media".into(),
custom_static: "static".into(),
custom_templates: "templates".into(),
}
}
}
impl Default for HttpConfig {
fn default() -> Self {
Self {
host: IpAddr::V6(Ipv6Addr::UNSPECIFIED),
port: 3000,
}
}
}
impl Default for SyntectConfig {
fn default() -> Self {
Self {
syntect: SyntectConfig {
load_defaults: false,
themes_dir: Some("themes".into()),
theme: Some("Catppuccin Mocha".into()),
},
}
}
}
#[cfg(feature = "precompression")]
impl Default for PrecompressionConfig {
impl Default for CacheConfig {
fn default() -> Self {
Self {
enable: false,
watch: true,
enable: true,
cleanup: true,
cleanup_interval: None,
persistence: true,
file: "cache".into(),
compress: true,
compression_level: 3,
}
}
}
#[instrument(name = "config")]
pub async fn load() -> Result<Config> {
let config_file = env::var(format!("{}_CONFIG", env!("CARGO_BIN_NAME")))
let config_file = env::var(format!(
"{}_CONFIG",
env!("CARGO_BIN_NAME").to_uppercase().replace('-', "_")
))
.unwrap_or(String::from("config.toml"));
match tokio::fs::OpenOptions::new()
.read(true)
@@ -128,3 +236,11 @@ pub async fn load() -> Result<Config> {
},
}
}
fn check_zstd_level_bounds<'de, D>(d: D) -> Result<i32, D::Error>
where
D: serde::Deserializer<'de>,
{
d.deserialize_i32(RangedI128Visitor::<1, 22>)
.map(|x| x as i32)
}
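The patched `load()` derives the config env var from the binary name by uppercasing it and normalizing `-` to `_` before appending `_CONFIG`. A small sketch of just that derivation (helper name is mine):

```rust
// Derive the config env var name the way the patched load() does:
// uppercase the binary name, replace '-' with '_', append _CONFIG.
fn config_env_var(bin_name: &str) -> String {
    format!("{}_CONFIG", bin_name.to_uppercase().replace('-', "_"))
}

fn main() {
    // "bingus-blog" would previously have produced the invalid
    // env var name "bingus-blog_CONFIG"; now it normalizes cleanly.
    assert_eq!(config_env_var("bingus-blog"), "BINGUS_BLOG_CONFIG");
}
```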

View file

@@ -1,9 +1,10 @@
use std::fmt::Display;
use axum::{http::StatusCode, response::IntoResponse};
use askama_axum::Template;
use axum::http::StatusCode;
use axum::response::{IntoResponse, Response};
use thiserror::Error;
// fronma is too lazy to implement std::error::Error for their own types
#[derive(Debug)]
#[repr(transparent)]
pub struct FronmaError(fronma::error::Error);
@@ -45,3 +46,47 @@ impl IntoResponse for PostError {
(StatusCode::INTERNAL_SERVER_ERROR, self.to_string()).into_response()
}
}
pub type AppResult<T> = Result<T, AppError>;
#[derive(Error, Debug)]
pub enum AppError {
#[error("failed to fetch post: {0}")]
PostError(#[from] PostError),
#[error(transparent)]
HandlebarsError(#[from] handlebars::RenderError),
#[error("rss is disabled")]
RssDisabled,
#[error(transparent)]
UrlError(#[from] url::ParseError),
}
impl From<std::io::Error> for AppError {
#[inline(always)]
fn from(value: std::io::Error) -> Self {
Self::PostError(PostError::IoError(value))
}
}
#[derive(Template)]
#[template(path = "error.html")]
struct ErrorTemplate {
error: String,
}
impl IntoResponse for AppError {
fn into_response(self) -> Response {
let status_code = match &self {
AppError::PostError(PostError::NotFound(_)) => StatusCode::NOT_FOUND,
AppError::RssDisabled => StatusCode::FORBIDDEN,
_ => StatusCode::INTERNAL_SERVER_ERROR,
};
(
status_code,
ErrorTemplate {
error: self.to_string(),
},
)
.into_response()
}
}
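The new `IntoResponse` impl maps error variants to HTTP statuses: missing posts become 404, the disabled-RSS guard becomes 403, and everything else falls through to 500. A dependency-free sketch of that mapping, with a hypothetical stand-in enum (plain `u16` codes instead of axum's `StatusCode`):

```rust
// Hypothetical stand-in for AppError, showing the same mapping
// into_response performs: NotFound -> 404, RssDisabled -> 403,
// everything else -> 500.
#[derive(Debug)]
enum AppError {
    PostNotFound(String),
    RssDisabled,
    Internal(String),
}

fn status_code(err: &AppError) -> u16 {
    match err {
        AppError::PostNotFound(_) => 404,
        AppError::RssDisabled => 403,
        _ => 500,
    }
}

fn main() {
    assert_eq!(status_code(&AppError::PostNotFound("hello".into())), 404);
    assert_eq!(status_code(&AppError::RssDisabled), 403);
    assert_eq!(status_code(&AppError::Internal("oops".into())), 500);
}
```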

View file

@@ -1,11 +0,0 @@
use std::time::Duration;
use chrono::{DateTime, TimeZone};
pub fn date<T: TimeZone>(date: &DateTime<T>) -> Result<String, askama::Error> {
Ok(date.to_rfc3339_opts(chrono::SecondsFormat::Secs, true))
}
pub fn duration(duration: &&Duration) -> Result<String, askama::Error> {
Ok(format!("{:?}", duration))
}

View file

@@ -23,27 +23,13 @@ where
}
}
/*pub fn get(&self, key: &Lookup) -> Option<Arc<T>> {
self.hash.and_then(|hash| {
let mut h = DefaultHasher::new();
key.hash(&mut h);
if hash == h.finish() {
self.inner.clone()
} else {
None
}
})
}*/
pub fn get_or_init(&mut self, key: &Lookup, init: impl Fn(&Lookup) -> Arc<T>) -> Arc<T> {
let mut h = DefaultHasher::new();
key.hash(&mut h);
let hash = h.finish();
if !self.hash.is_some_and(|inner_hash| inner_hash == hash) {
let mut h = DefaultHasher::new();
key.hash(&mut h);
self.inner = Some(init(key));
self.hash = Some(h.finish());
self.hash = Some(hash);
}
// safety: please.
unsafe { self.inner.as_ref().unwrap_unchecked().clone() }
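The refactored `get_or_init` hashes the key once and reuses that hash both for the staleness check and for storing, instead of hashing twice. A safe, self-contained sketch of the same single-slot memoization (using `unwrap()` where the original justifies `unwrap_unchecked`):

```rust
use std::hash::{DefaultHasher, Hash, Hasher};
use std::sync::Arc;

// Simplified sketch of hash_arc_store: keep one cached Arc plus the
// hash of the key it was built from; rebuild only when the key's
// hash no longer matches.
struct HashArcStore<T> {
    inner: Option<Arc<T>>,
    hash: Option<u64>,
}

impl<T> HashArcStore<T> {
    fn new() -> Self {
        Self { inner: None, hash: None }
    }

    fn get_or_init<K: Hash>(&mut self, key: &K, init: impl Fn(&K) -> Arc<T>) -> Arc<T> {
        let mut h = DefaultHasher::new();
        key.hash(&mut h);
        let hash = h.finish();
        if self.hash != Some(hash) {
            self.inner = Some(init(key));
            self.hash = Some(hash);
        }
        // inner is always Some after the branch above
        self.inner.as_ref().unwrap().clone()
    }
}

fn main() {
    let mut store: HashArcStore<String> = HashArcStore::new();
    let a = store.get_or_init(&"key1", |k| Arc::new(k.to_string()));
    let b = store.get_or_init(&"key1", |k| Arc::new(k.to_string()));
    // same key -> the cached Arc is reused, not rebuilt
    assert!(Arc::ptr_eq(&a, &b));
    let c = store.get_or_init(&"key2", |k| Arc::new(k.to_string()));
    assert_eq!(&*c, "key2");
}
```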

24
src/helpers.rs Normal file
View file

@@ -0,0 +1,24 @@
use std::fmt::Display;
use std::time::Duration;
use chrono::{DateTime, TimeZone, Utc};
use handlebars::handlebars_helper;
use crate::config::DateFormat;
fn date_impl<T>(date_time: &DateTime<T>, date_format: &DateFormat) -> String
where
T: TimeZone,
T::Offset: Display,
{
match date_format {
DateFormat::RFC3339 => date_time.to_rfc3339_opts(chrono::SecondsFormat::Secs, true),
DateFormat::Strftime(ref format_string) => date_time.format(format_string).to_string(),
}
}
handlebars_helper!(date: |date_time: Option<DateTime<Utc>>, date_format: DateFormat| {
date_impl(date_time.as_ref().unwrap(), &date_format)
});
handlebars_helper!(duration: |duration_: Duration| format!("{:?}", duration_));
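`date_impl` dispatches on the configured `DateFormat`: the `RFC3339` variant uses a fixed layout, while `Strftime` hands the format string to chrono. A chrono-free sketch of that dispatch, substituting only a tiny illustrative subset of format codes (the real helper delegates formatting to chrono):

```rust
// Simplified stand-in for config::DateFormat and the dispatch in
// date_impl; only %Y/%m/%d are handled here, purely for illustration.
enum DateFormat {
    Rfc3339,
    Strftime(String),
}

fn format_date(y: u32, m: u32, d: u32, fmt: &DateFormat) -> String {
    match fmt {
        DateFormat::Rfc3339 => format!("{y:04}-{m:02}-{d:02}T00:00:00Z"),
        DateFormat::Strftime(s) => s
            .replace("%Y", &format!("{y:04}"))
            .replace("%m", &format!("{m:02}"))
            .replace("%d", &format!("{d:02}")),
    }
}

fn main() {
    assert_eq!(format_date(2024, 8, 13, &DateFormat::Rfc3339), "2024-08-13T00:00:00Z");
    let f = DateFormat::Strftime("%d.%m.%Y".into());
    assert_eq!(format_date(2024, 8, 13, &f), "13.08.2024");
}
```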

View file

@@ -1,14 +1,17 @@
#![feature(let_chains, stmt_expr_attributes, proc_macro_hygiene)]
#![feature(let_chains, pattern)]
mod append_path;
mod compress;
mod app;
mod config;
mod error;
mod filters;
mod hash_arc_store;
mod helpers;
mod markdown_render;
mod platform;
mod post;
mod watcher;
mod ranged_i128_visitor;
mod serve_dir_included;
mod systemtime_as_secs;
mod templates;
use std::future::IntoFuture;
use std::net::SocketAddr;
@@ -16,276 +19,118 @@ use std::process::exit;
use std::sync::Arc;
use std::time::Duration;
use askama_axum::Template;
use axum::extract::{MatchedPath, Path, State};
use axum::http::{Request, StatusCode};
use axum::response::{IntoResponse, Redirect, Response};
use axum::routing::{get, Router};
use axum::Json;
use color_eyre::eyre::{self, Context};
use thiserror::Error;
use tokio::io::{AsyncReadExt, AsyncWriteExt};
use tokio::net::TcpListener;
use tokio::signal;
use tokio::sync::RwLock;
use tokio::task::JoinSet;
use tokio::time::Instant;
use tokio::{select, signal};
use tokio_util::sync::CancellationToken;
use tower_http::services::ServeDir;
use tower_http::trace::TraceLayer;
use tracing::level_filters::LevelFilter;
use tracing::{error, info, info_span, warn, Span};
use tracing_subscriber::{layer::SubscriberExt, util::SubscriberInitExt, EnvFilter};
use tracing::{debug, error, info, info_span, warn, Instrument};
use tracing_subscriber::layer::SubscriberExt;
use tracing_subscriber::{util::SubscriberInitExt, EnvFilter};
use crate::compress::compress_epicly;
use crate::config::Config;
use crate::error::PostError;
use crate::post::{PostManager, PostMetadata, RenderStats};
use crate::watcher::watch;
type ArcState = Arc<AppState>;
#[derive(Clone)]
struct AppState {
pub config: Config,
pub posts: PostManager,
}
#[derive(Template)]
#[template(path = "index.html")]
struct IndexTemplate {
title: String,
description: String,
posts: Vec<PostMetadata>,
}
#[derive(Template)]
#[template(path = "view_post.html")]
struct ViewPostTemplate {
meta: PostMetadata,
rendered: String,
rendered_in: RenderStats,
markdown_access: bool,
}
type AppResult<T> = Result<T, PostError>;
#[derive(Error, Debug)]
enum AppError {
#[error("failed to fetch post: {0}")]
PostError(#[from] PostError),
}
#[derive(Template)]
#[template(path = "error.html")]
struct ErrorTemplate {
error: String,
}
impl IntoResponse for AppError {
fn into_response(self) -> Response {
let status_code = match &self {
AppError::PostError(err) => match err {
PostError::NotFound(_) => StatusCode::NOT_FOUND,
_ => StatusCode::INTERNAL_SERVER_ERROR,
},
//_ => StatusCode::INTERNAL_SERVER_ERROR,
};
(
status_code,
ErrorTemplate {
error: self.to_string(),
},
)
.into_response()
}
}
async fn index(State(state): State<ArcState>) -> AppResult<IndexTemplate> {
Ok(IndexTemplate {
title: state.config.title.clone(),
description: state.config.description.clone(),
posts: state.posts.list_posts().await?,
})
}
async fn post(State(state): State<ArcState>, Path(name): Path<String>) -> AppResult<Response> {
if name.ends_with(".md") && state.config.markdown_access {
let mut file = tokio::fs::OpenOptions::new()
.read(true)
.open(state.config.posts_dir.join(&name))
.await?;
let mut buf = Vec::new();
file.read_to_end(&mut buf).await?;
Ok(([("content-type", "text/plain")], buf).into_response())
} else {
let post = state.posts.get_post(&name).await?;
let page = ViewPostTemplate {
meta: post.0,
rendered: post.1,
rendered_in: post.2,
markdown_access: state.config.markdown_access,
};
Ok(page.into_response())
}
}
async fn all_posts(State(state): State<ArcState>) -> AppResult<Json<Vec<PostMetadata>>> {
let posts = state.posts.list_posts().await?;
Ok(Json(posts))
}
use crate::app::AppState;
use crate::post::{MarkdownPosts, PostManager};
use crate::templates::new_registry;
use crate::templates::watcher::watch_templates;
#[tokio::main]
async fn main() -> eyre::Result<()> {
#[cfg(feature = "tokio-console")]
console_subscriber::init();
color_eyre::install()?;
#[cfg(not(feature = "tokio-console"))]
tracing_subscriber::registry()
let reg = tracing_subscriber::registry();
#[cfg(feature = "tokio-console")]
let reg = reg
.with(
EnvFilter::builder()
.with_default_directive(LevelFilter::TRACE.into())
.from_env_lossy(),
)
.with(console_subscriber::spawn());
#[cfg(not(feature = "tokio-console"))]
let reg = reg.with(
EnvFilter::builder()
.with_default_directive(LevelFilter::INFO.into())
.from_env_lossy(),
)
.with(tracing_subscriber::fmt::layer())
.init();
);
reg.with(tracing_subscriber::fmt::layer()).init();
let config = config::load()
let config = Arc::new(
config::load()
.await
.context("couldn't load configuration")?;
.context("couldn't load configuration")?,
);
let socket_addr = SocketAddr::new(config.http.host, config.http.port);
let mut tasks = JoinSet::new();
let mut cancellation_tokens = Vec::new();
let cancellation_token = CancellationToken::new();
#[cfg(feature = "precompression")]
if config.precompression.enable {
let span = info_span!("compression");
info!(parent: span.clone(), "compressing static");
let start = Instant::now();
// NOTE: use tokio::task::spawn_blocking if this ever turns into a concurrent task
let mut reg = new_registry(&config.dirs.custom_templates)
.context("failed to create handlebars registry")?;
reg.register_helper("date", Box::new(helpers::date));
reg.register_helper("duration", Box::new(helpers::duration));
debug!(duration = ?start.elapsed(), "registered all templates");
let compressed = tokio::task::spawn_blocking(|| compress_epicly("static"))
.await
.unwrap()
.context("couldn't compress static")?;
let reg = Arc::new(RwLock::new(reg));
let _handle = span.enter();
let watcher_token = cancellation_token.child_token();
if compressed > 0 {
info!(compressed_files=%compressed, "compressed {compressed} files");
}
if config.precompression.watch {
info!("starting compressor task");
let span = span.clone();
let token = CancellationToken::new();
let passed_token = token.clone();
tasks.spawn(async move {
watch(span, passed_token, Default::default())
.await
.context("failed to watch static")
.unwrap()
});
cancellation_tokens.push(token);
}
}
let posts = if let Some(path) = config.cache_file.as_ref()
&& tokio::fs::try_exists(&path)
.await
.with_context(|| format!("failed to check if {} exists", path.display()))?
{
info!("loading cache from file");
let load_cache = async {
let mut cache_file = tokio::fs::File::open(&path)
.await
.context("failed to open cache file")?;
let mut serialized = Vec::with_capacity(4096);
cache_file
.read_to_end(&mut serialized)
.await
.context("failed to read cache file")?;
let cache =
bitcode::deserialize(serialized.as_slice()).context("failed to parse cache")?;
Ok::<PostManager, color_eyre::Report>(PostManager::new_with_cache(
config.posts_dir.clone(),
config.render.clone(),
cache,
))
}
.await;
match load_cache {
Ok(posts) => posts,
Err(err) => {
error!("failed to load cache: {}", err);
info!("using empty cache");
PostManager::new(config.posts_dir.clone(), config.render.clone())
}
}
} else {
PostManager::new(config.posts_dir.clone(), config.render.clone())
let posts = Arc::new(MarkdownPosts::new(Arc::clone(&config)).await?);
let state = AppState {
config: Arc::clone(&config),
posts: Arc::clone(&posts),
reg: Arc::clone(&reg),
};
let state = Arc::new(AppState { config, posts });
let app = Router::new()
.route("/", get(index))
.route(
"/post/:name",
get(
|Path(name): Path<String>| async move { Redirect::to(&format!("/posts/{}", name)) },
),
debug!("setting up watcher");
tasks.spawn(
watch_templates(
config.dirs.custom_templates.clone(),
watcher_token.clone(),
reg,
)
.route("/posts/:name", get(post))
.route("/posts", get(all_posts))
.nest_service("/static", ServeDir::new("static").precompressed_gzip())
.nest_service("/media", ServeDir::new("media"))
.layer(
TraceLayer::new_for_http()
.make_span_with(|request: &Request<_>| {
let matched_path = request
.extensions()
.get::<MatchedPath>()
.map(MatchedPath::as_str);
.instrument(info_span!("custom_template_watcher")),
);
info_span!(
"request",
method = ?request.method(),
path = ?request.uri().path(),
matched_path,
)
})
.on_response(|response: &Response<_>, duration: Duration, span: &Span| {
let _ = span.enter();
let status = response.status();
info!(?status, ?duration, "response");
}),
)
.with_state(state.clone());
if config.cache.enable && config.cache.cleanup {
if let Some(millis) = config.cache.cleanup_interval {
let posts = Arc::clone(&posts);
let token = cancellation_token.child_token();
debug!("setting up cleanup task");
tasks.spawn(async move {
let mut interval = tokio::time::interval(Duration::from_millis(millis));
loop {
select! {
_ = token.cancelled() => break Ok(()),
_ = interval.tick() => {
posts.cleanup().await
}
}
}
});
} else {
posts.cleanup().await;
}
}
let listener = TcpListener::bind((state.config.host, state.config.port))
let app = app::new(&config).with_state(state.clone());
let listener = TcpListener::bind(socket_addr)
.await
.with_context(|| {
format!(
"couldn't listen on {}",
SocketAddr::new(state.config.host, state.config.port)
)
})?;
.with_context(|| format!("couldn't listen on {}", socket_addr))?;
let local_addr = listener
.local_addr()
.context("couldn't get socket address")?;
info!("listening on http://{}", local_addr);
let sigint = signal::ctrl_c();
#[cfg(unix)]
let mut sigterm_handler =
tokio::signal::unix::signal(tokio::signal::unix::SignalKind::terminate())?;
#[cfg(unix)]
let sigterm = sigterm_handler.recv();
#[cfg(not(unix))] // TODO: kill all windows server users
let sigterm = std::future::pending::<()>();
let sigterm = platform::sigterm();
let axum_token = CancellationToken::new();
cancellation_tokens.push(axum_token.clone());
let axum_token = cancellation_token.child_token();
let mut server = axum::serve(
listener,
@@ -308,42 +153,21 @@ async fn main() -> eyre::Result<()> {
let cleanup = async move {
// stop tasks
for token in cancellation_tokens {
token.cancel();
}
cancellation_token.cancel();
server.await.context("failed to serve app")?;
while let Some(task) = tasks.join_next().await {
task.context("failed to join task")?;
let res = task.context("failed to join task")?;
if let Err(err) = res {
error!("task failed with error: {err}");
}
}
// write cache to file
let AppState { config, posts } = Arc::<AppState>::try_unwrap(state).unwrap_or_else(|state| {
warn!("couldn't unwrap Arc over AppState, more than one strong reference exists for Arc. cloning instead");
AppState::clone(state.as_ref())
});
if let Some(path) = config.cache_file.as_ref() {
let cache = posts.into_cache();
let mut serialized = bitcode::serialize(&cache).context("failed to serialize cache")?;
let mut cache_file = tokio::fs::File::create(path)
.await
.with_context(|| format!("failed to open cache at {}", path.display()))?;
cache_file
.write_all(serialized.as_mut_slice())
.await
.context("failed to write cache to file")?;
info!("wrote cache to {}", path.display());
}
drop(state);
Ok::<(), color_eyre::Report>(())
};
let sigint = signal::ctrl_c();
#[cfg(unix)]
let mut sigterm_handler =
tokio::signal::unix::signal(tokio::signal::unix::SignalKind::terminate())?;
#[cfg(unix)]
let sigterm = sigterm_handler.recv();
#[cfg(not(unix))]
let sigterm = std::future::pending::<()>();
let sigterm = platform::sigterm();
tokio::select! {
result = cleanup => {
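The cleanup task above ticks at `cleanup_interval` and stops when the cancellation token fires. As a dependency-free analogue (std threads and an `AtomicBool` standing in for tokio's `select!` over `interval.tick()` and `token.cancelled()` — this is not the tokio code itself):

```rust
use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::Arc;
use std::thread;
use std::time::Duration;

// Run `cleanup` once per interval until the cancellation flag is set,
// mirroring the select! loop in the tokio version. Returns how many
// cleanup passes ran.
fn spawn_cleanup(
    interval: Duration,
    cancelled: Arc<AtomicBool>,
    cleanup: impl Fn() + Send + 'static,
) -> thread::JoinHandle<u32> {
    thread::spawn(move || {
        let mut ticks = 0;
        while !cancelled.load(Ordering::Relaxed) {
            cleanup();
            ticks += 1;
            thread::sleep(interval);
        }
        ticks
    })
}

fn main() {
    let cancelled = Arc::new(AtomicBool::new(false));
    let handle = spawn_cleanup(Duration::from_millis(10), Arc::clone(&cancelled), || {});
    thread::sleep(Duration::from_millis(50));
    cancelled.store(true, Ordering::Relaxed); // like cancellation_token.cancel()
    let ticks = handle.join().unwrap();
    assert!(ticks >= 1);
}
```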

View file

@@ -32,7 +32,7 @@ fn build_syntect(config: &RenderConfig) -> Arc<SyntectAdapter> {
Arc::new(builder.build())
}
pub fn render(markdown: &str, config: &RenderConfig, front_matter: bool) -> String {
pub fn render(markdown: &str, config: &RenderConfig) -> String {
let mut options = ComrakOptions::default();
options.extension.table = true;
options.extension.autolink = true;
@@ -41,9 +41,6 @@ pub fn render(markdown: &str, config: &RenderConfig, front_matter: bool) -> Stri
options.extension.strikethrough = true;
options.extension.multiline_block_quotes = true;
options.extension.header_ids = Some(String::new());
if front_matter {
options.extension.front_matter_delimiter = Some(String::from("---"));
};
let mut render_plugins = RenderPlugins::default();
let syntect = syntect_adapter(config);

9
src/platform.rs Normal file
View file

@@ -0,0 +1,9 @@
pub async fn sigterm() -> Result<Option<()>, std::io::Error> {
#[cfg(unix)]
let mut sigterm_handler =
tokio::signal::unix::signal(tokio::signal::unix::SignalKind::terminate())?;
#[cfg(unix)]
return Ok(sigterm_handler.recv().await);
#[cfg(not(unix))]
std::future::pending::<Result<Option<()>, std::io::Error>>().await
}

View file

@@ -1,12 +1,16 @@
use std::hash::{DefaultHasher, Hash, Hasher};
use std::io::Read;
use scc::HashMap;
use serde::de::Visitor;
use serde::ser::SerializeMap;
use serde::{Deserialize, Deserializer, Serialize, Serializer};
use crate::config::RenderConfig;
use crate::config::{Config, RenderConfig};
use crate::post::PostMetadata;
use color_eyre::eyre::{self, Context};
use scc::HashMap;
use serde::{Deserialize, Serialize};
use tokio::io::AsyncReadExt;
use tracing::{debug, instrument};
/// do not persist cache if this version number changed
pub const CACHE_VERSION: u16 = 2;
#[derive(Serialize, Deserialize, Clone)]
pub struct CacheValue {
@@ -16,14 +20,16 @@
config_hash: u64,
}
#[derive(Default, Clone)]
pub struct Cache(HashMap<String, CacheValue>);
#[derive(Serialize, Deserialize, Clone)]
pub struct Cache(HashMap<String, CacheValue>, u16);
impl Cache {
pub fn from_map(cache: HashMap<String, CacheValue>) -> Self {
Self(cache)
impl Default for Cache {
fn default() -> Self {
Self(Default::default(), CACHE_VERSION)
}
}
impl Cache {
pub async fn lookup(
&self,
name: &str,
@@ -102,58 +108,56 @@
self.0.remove_async(name).await
}
#[inline(always)]
pub fn into_inner(self) -> HashMap<String, CacheValue> {
#[instrument(name = "cleanup", skip_all)]
pub async fn cleanup(&self, get_mtime: impl Fn(&str) -> Option<u64>) {
let old_size = self.0.len();
let mut i = 0;
// TODO: multithread
self.0
.retain_async(|k, v| {
if get_mtime(k).is_some_and(|mtime| mtime == v.mtime) {
true
} else {
debug!("removing {k} from cache");
i += 1;
false
}
})
.await;
let new_size = self.0.len();
debug!("removed {i} entries ({old_size} -> {new_size} entries)");
}
#[inline(always)]
pub fn version(&self) -> u16 {
self.1
}
}
impl Serialize for Cache {
fn serialize<S>(&self, serializer: S) -> Result<S::Ok, S::Error>
where
S: Serializer,
{
let cache = self.clone().into_inner();
let mut map = serializer.serialize_map(Some(cache.len()))?;
let mut entry = cache.first_entry();
while let Some(occupied) = entry {
map.serialize_entry(occupied.key(), occupied.get())?;
entry = occupied.next();
}
map.end()
}
}
impl<'de> Deserialize<'de> for Cache {
fn deserialize<D>(deserializer: D) -> Result<Self, D::Error>
where
D: Deserializer<'de>,
{
struct CoolVisitor;
impl<'de> Visitor<'de> for CoolVisitor {
type Value = Cache;
fn expecting(&self, formatter: &mut std::fmt::Formatter) -> std::fmt::Result {
write!(formatter, "expected a map")
}
fn visit_map<A>(self, mut map: A) -> Result<Self::Value, A::Error>
where
A: serde::de::MapAccess<'de>,
{
let cache = match map.size_hint() {
Some(size) => HashMap::with_capacity(size),
None => HashMap::new(),
pub(crate) async fn load_cache(config: &Config) -> Result<Cache, eyre::Report> {
let path = &config.cache.file;
let mut cache_file = tokio::fs::File::open(&path)
.await
.context("failed to open cache file")?;
let serialized = if config.cache.compress {
let cache_file = cache_file.into_std().await;
tokio::task::spawn_blocking(move || {
let mut buf = Vec::with_capacity(4096);
zstd::stream::read::Decoder::new(cache_file)?.read_to_end(&mut buf)?;
Ok::<_, std::io::Error>(buf)
})
.await?
.context("failed to read cache file")?
} else {
let mut buf = Vec::with_capacity(4096);
cache_file
.read_to_end(&mut buf)
.await
.context("failed to read cache file")?;
buf
};
while let Some((key, value)) = map.next_entry::<String, CacheValue>()? {
cache.insert(key, value).ok();
}
Ok(Cache::from_map(cache))
}
}
deserializer.deserialize_map(CoolVisitor)
}
bitcode::deserialize(serialized.as_slice()).context("failed to parse cache")
}
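The cache now carries `CACHE_VERSION` alongside the map, and `MarkdownPosts::new` discards a persisted cache written by an older build instead of reinterpreting it. A std-only sketch of that version gate (the `String` payload stands in for the real `CacheValue`):

```rust
use std::collections::HashMap;

const CACHE_VERSION: u16 = 2;

// Illustrative stand-in for the persisted cache: a map plus the
// version it was written with.
struct Cache(HashMap<String, String>, u16);

impl Default for Cache {
    fn default() -> Self {
        Self(HashMap::new(), CACHE_VERSION)
    }
}

// Mirror the check in MarkdownPosts::new: an older on-disk cache is
// replaced with an empty, current-version one.
fn validate(cache: Cache) -> Cache {
    if cache.1 < CACHE_VERSION {
        Cache::default()
    } else {
        cache
    }
}

fn main() {
    let mut old = Cache(HashMap::new(), 1);
    old.0.insert("post".into(), "stale".into());
    let fresh = validate(old);
    assert!(fresh.0.is_empty()); // stale entries dropped
    assert_eq!(fresh.1, CACHE_VERSION);
}
```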

348
src/post/markdown_posts.rs Normal file
View file

@@ -0,0 +1,348 @@
use std::collections::BTreeSet;
use std::io::{self, Write};
use std::ops::Deref;
use std::path::Path;
use std::time::Duration;
use std::time::Instant;
use std::time::SystemTime;
use axum::http::HeaderValue;
use chrono::{DateTime, Utc};
use color_eyre::eyre::{self, Context};
use fronma::parser::{parse, ParsedData};
use serde::Deserialize;
use tokio::fs;
use tokio::io::AsyncReadExt;
use tracing::{error, info, warn};
use crate::config::Config;
use crate::markdown_render::render;
use crate::post::cache::{load_cache, Cache, CACHE_VERSION};
use crate::post::{PostError, PostManager, PostMetadata, RenderStats, ReturnedPost};
use crate::systemtime_as_secs::as_secs;
#[derive(Deserialize)]
struct FrontMatter {
pub title: String,
pub description: String,
pub author: String,
pub icon: Option<String>,
pub icon_alt: Option<String>,
pub color: Option<String>,
pub created_at: Option<DateTime<Utc>>,
pub modified_at: Option<DateTime<Utc>>,
#[serde(default)]
pub tags: BTreeSet<String>,
}
impl FrontMatter {
pub fn into_full(
self,
name: String,
created: Option<SystemTime>,
modified: Option<SystemTime>,
) -> PostMetadata {
PostMetadata {
name,
title: self.title,
description: self.description,
author: self.author,
icon: self.icon,
icon_alt: self.icon_alt,
color: self.color,
created_at: self.created_at.or_else(|| created.map(|t| t.into())),
modified_at: self.modified_at.or_else(|| modified.map(|t| t.into())),
tags: self.tags.into_iter().collect(),
}
}
}
pub struct MarkdownPosts<C>
where
C: Deref<Target = Config>,
{
cache: Option<Cache>,
config: C,
}
impl<C> MarkdownPosts<C>
where
C: Deref<Target = Config>,
{
pub async fn new(config: C) -> eyre::Result<MarkdownPosts<C>> {
if config.cache.enable {
if config.cache.persistence && tokio::fs::try_exists(&config.cache.file).await? {
info!("loading cache from file");
let mut cache = load_cache(&config).await.unwrap_or_else(|err| {
error!("failed to load cache: {}", err);
info!("using empty cache");
Default::default()
});
if cache.version() < CACHE_VERSION {
warn!("cache version changed, clearing cache");
cache = Default::default();
};
Ok(Self {
cache: Some(cache),
config,
})
} else {
Ok(Self {
cache: Some(Default::default()),
config,
})
}
} else {
Ok(Self {
cache: None,
config,
})
}
}
async fn parse_and_render(
&self,
name: String,
path: impl AsRef<Path>,
) -> Result<(PostMetadata, String, (Duration, Duration)), PostError> {
let parsing_start = Instant::now();
let mut file = match tokio::fs::OpenOptions::new().read(true).open(&path).await {
Ok(val) => val,
Err(err) => match err.kind() {
io::ErrorKind::NotFound => return Err(PostError::NotFound(name)),
_ => return Err(PostError::IoError(err)),
},
};
let stat = file.metadata().await?;
let modified = stat.modified()?;
let created = stat.created().ok();
let mut content = String::with_capacity(stat.len() as usize);
file.read_to_string(&mut content).await?;
let ParsedData { headers, body } = parse::<FrontMatter>(&content)?;
let metadata = headers.into_full(name.to_owned(), created, Some(modified));
let parsing = parsing_start.elapsed();
let before_render = Instant::now();
let post = render(body, &self.config.render);
let rendering = before_render.elapsed();
if let Some(cache) = self.cache.as_ref() {
cache
.insert(
name.to_string(),
metadata.clone(),
as_secs(&modified),
post.clone(),
&self.config.render,
)
.await
.unwrap_or_else(|err| warn!("failed to insert {:?} into cache", err.0))
};
Ok((metadata, post, (parsing, rendering)))
}
fn cache(&self) -> Option<&Cache> {
self.cache.as_ref()
}
fn try_drop(&mut self) -> Result<(), eyre::Report> {
// write cache to file
let config = &self.config.cache;
if config.enable
&& config.persistence
&& let Some(cache) = self.cache()
{
let path = &config.file;
let serialized = bitcode::serialize(cache).context("failed to serialize cache")?;
let mut cache_file = std::fs::File::create(path)
.with_context(|| format!("failed to open cache at {}", path.display()))?;
let compression_level = config.compression_level;
if config.compress {
std::io::Write::write_all(
&mut zstd::stream::write::Encoder::new(cache_file, compression_level)?
.auto_finish(),
&serialized,
)
} else {
cache_file.write_all(&serialized)
}
.context("failed to write cache to file")?;
info!("wrote cache to {}", path.display());
}
Ok(())
}
}
impl<C> Drop for MarkdownPosts<C>
where
C: Deref<Target = Config>,
{
fn drop(&mut self) {
self.try_drop().unwrap()
}
}
impl<C> PostManager for MarkdownPosts<C>
where
C: Deref<Target = Config>,
{
async fn get_all_post_metadata(
&self,
filter: impl Fn(&PostMetadata) -> bool,
) -> Result<Vec<PostMetadata>, PostError> {
let mut posts = Vec::new();
let mut read_dir = fs::read_dir(&self.config.dirs.posts).await?;
while let Some(entry) = read_dir.next_entry().await? {
let path = entry.path();
let stat = fs::metadata(&path).await?;
if stat.is_file() && path.extension().is_some_and(|ext| ext == "md") {
let mtime = as_secs(&stat.modified()?);
// TODO. this?
let name = path
.clone()
.file_stem()
.unwrap()
.to_string_lossy()
.to_string();
if let Some(cache) = self.cache.as_ref()
&& let Some(hit) = cache.lookup_metadata(&name, mtime).await
&& filter(&hit)
{
posts.push(hit);
} else {
match self.parse_and_render(name, path).await {
Ok((metadata, ..)) => {
if filter(&metadata) {
posts.push(metadata);
}
}
Err(err) => match err {
PostError::IoError(ref io_err)
if matches!(io_err.kind(), io::ErrorKind::NotFound) =>
{
warn!("TOCTOU: {}", err)
}
_ => return Err(err),
},
}
}
}
}
Ok(posts)
}
async fn get_all_posts(
&self,
filter: impl Fn(&PostMetadata, &str) -> bool,
) -> Result<Vec<(PostMetadata, String, RenderStats)>, PostError> {
let mut posts = Vec::new();
let mut read_dir = fs::read_dir(&self.config.dirs.posts).await?;
while let Some(entry) = read_dir.next_entry().await? {
let path = entry.path();
let stat = fs::metadata(&path).await?;
if stat.is_file() && path.extension().is_some_and(|ext| ext == "md") {
let name = path
.clone()
.file_stem()
.unwrap()
.to_string_lossy()
.to_string();
let post = self.get_post(&name).await?;
if let ReturnedPost::Rendered(meta, content, stats) = post
&& filter(&meta, &content)
{
posts.push((meta, content, stats));
}
}
}
Ok(posts)
}
async fn get_post(&self, name: &str) -> Result<ReturnedPost, PostError> {
if self.config.markdown_access && name.ends_with(".md") {
let path = self.config.dirs.posts.join(name);
let mut file = match tokio::fs::OpenOptions::new().read(true).open(&path).await {
Ok(value) => value,
Err(err) => match err.kind() {
io::ErrorKind::NotFound => {
if let Some(cache) = self.cache.as_ref() {
cache.remove(name).await;
}
return Err(PostError::NotFound(name.to_string()));
}
_ => return Err(PostError::IoError(err)),
},
};
let mut buf = Vec::with_capacity(4096);
file.read_to_end(&mut buf).await?;
Ok(ReturnedPost::Raw(
buf,
HeaderValue::from_static("text/plain"),
))
} else {
let start = Instant::now();
let path = self.config.dirs.posts.join(name.to_owned() + ".md");
let stat = match tokio::fs::metadata(&path).await {
Ok(value) => value,
Err(err) => match err.kind() {
io::ErrorKind::NotFound => {
if let Some(cache) = self.cache.as_ref() {
cache.remove(name).await;
}
return Err(PostError::NotFound(name.to_string()));
}
_ => return Err(PostError::IoError(err)),
},
};
let mtime = as_secs(&stat.modified()?);
if let Some(cache) = self.cache.as_ref()
&& let Some(hit) = cache.lookup(name, mtime, &self.config.render).await
{
Ok(ReturnedPost::Rendered(
hit.metadata,
hit.rendered,
RenderStats::Cached(start.elapsed()),
))
} else {
let (metadata, rendered, stats) =
self.parse_and_render(name.to_string(), path).await?;
Ok(ReturnedPost::Rendered(
metadata,
rendered,
RenderStats::ParsedAndRendered(start.elapsed(), stats.0, stats.1),
))
}
}
}
async fn cleanup(&self) {
if let Some(cache) = self.cache.as_ref() {
cache
.cleanup(|name| {
std::fs::metadata(self.config.dirs.posts.join(name.to_owned() + ".md"))
.ok()
.and_then(|metadata| metadata.modified().ok())
.map(|mtime| as_secs(&mtime))
})
.await
}
}
}
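`MarkdownPosts::cleanup` feeds `Cache::cleanup` a closure that looks up each post's current on-disk mtime; entries whose recorded mtime no longer matches (or whose file is gone) are dropped. The same retain logic over a plain `std::collections::HashMap` (the `u64` values stand in for the cached mtimes):

```rust
use std::collections::HashMap;

// Keep an entry only while the mtime recorded at render time still
// matches the file on disk; get_mtime stands in for the fs::metadata
// lookup in cleanup(). Returns how many entries were evicted.
fn cleanup(cache: &mut HashMap<String, u64>, get_mtime: impl Fn(&str) -> Option<u64>) -> usize {
    let before = cache.len();
    cache.retain(|name, cached_mtime| get_mtime(name) == Some(*cached_mtime));
    before - cache.len()
}

fn main() {
    let mut cache = HashMap::new();
    cache.insert("kept".to_string(), 100);
    cache.insert("modified".to_string(), 100);
    cache.insert("deleted".to_string(), 100);
    let removed = cleanup(&mut cache, |name| match name {
        "kept" => Some(100),     // unchanged on disk
        "modified" => Some(200), // rewritten since it was cached
        _ => None,               // file no longer exists
    });
    assert_eq!(removed, 2);
    assert!(cache.contains_key("kept"));
}
```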

View file

@@ -1,228 +1,85 @@
mod cache;
pub mod cache;
pub mod markdown_posts;
use std::io;
use std::path::{Path, PathBuf};
use std::time::{Duration, Instant, SystemTime};
use std::time::Duration;
use askama::Template;
use axum::http::HeaderValue;
use chrono::{DateTime, Utc};
use fronma::parser::{parse, ParsedData};
use serde::{Deserialize, Serialize};
use tokio::fs;
use tokio::io::AsyncReadExt;
use tracing::warn;
use crate::config::RenderConfig;
use crate::markdown_render::render;
use crate::post::cache::Cache;
use crate::PostError;
use crate::error::PostError;
pub use crate::post::markdown_posts::MarkdownPosts;
#[derive(Deserialize)]
struct FrontMatter {
pub title: String,
pub description: String,
pub author: String,
pub icon: Option<String>,
pub created_at: Option<DateTime<Utc>>,
pub modified_at: Option<DateTime<Utc>>,
}
impl FrontMatter {
pub fn into_full(
self,
name: String,
created: Option<SystemTime>,
modified: Option<SystemTime>,
) -> PostMetadata {
PostMetadata {
name,
title: self.title,
description: self.description,
author: self.author,
icon: self.icon,
created_at: self.created_at.or_else(|| created.map(|t| t.into())),
modified_at: self.modified_at.or_else(|| modified.map(|t| t.into())),
}
}
}
#[derive(Serialize, Deserialize, Clone)]
#[derive(Serialize, Deserialize, Clone, Debug)]
pub struct PostMetadata {
pub name: String,
pub title: String,
pub description: String,
pub author: String,
pub icon: Option<String>,
pub icon_alt: Option<String>,
pub color: Option<String>,
pub created_at: Option<DateTime<Utc>>,
pub modified_at: Option<DateTime<Utc>>,
pub tags: Vec<String>,
}
use crate::filters;
#[derive(Template)]
#[template(path = "post.html")]
struct Post<'a> {
pub meta: &'a PostMetadata,
pub rendered_markdown: String,
}
// format: TOTAL OP1 OP2
#[allow(unused)]
#[derive(Serialize)]
pub enum RenderStats {
Cached(Duration),
// format: Total, Parsed in, Rendered in
ParsedAndRendered(Duration, Duration, Duration),
}
#[derive(Clone)]
pub struct PostManager {
dir: PathBuf,
cache: Cache,
config: RenderConfig,
#[allow(clippy::large_enum_variant)] // Raw will be returned very rarely
pub enum ReturnedPost {
Rendered(PostMetadata, String, RenderStats),
Raw(Vec<u8>, HeaderValue),
}
impl PostManager {
pub fn new(dir: PathBuf, config: RenderConfig) -> PostManager {
PostManager {
dir,
cache: Default::default(),
config,
}
}
pub fn new_with_cache(dir: PathBuf, config: RenderConfig, cache: Cache) -> PostManager {
PostManager { dir, cache, config }
}
async fn parse_and_render(
pub trait PostManager {
async fn get_all_post_metadata(
&self,
name: String,
path: impl AsRef<Path>,
) -> Result<(PostMetadata, String, (Duration, Duration)), PostError> {
let parsing_start = Instant::now();
let mut file = match tokio::fs::OpenOptions::new().read(true).open(&path).await {
Ok(val) => val,
Err(err) => match err.kind() {
io::ErrorKind::NotFound => return Err(PostError::NotFound(name)),
_ => return Err(PostError::IoError(err)),
},
};
let stat = file.metadata().await?;
let modified = stat.modified()?;
let created = stat.created().ok();
let mut content = String::with_capacity(stat.len() as usize);
file.read_to_string(&mut content).await?;
let ParsedData { headers, body } = parse::<FrontMatter>(&content)?;
let metadata = headers.into_full(name.to_owned(), created, Some(modified));
let parsing = parsing_start.elapsed();
let before_render = Instant::now();
let rendered_markdown = render(body, &self.config, false);
let post = Post {
meta: &metadata,
rendered_markdown,
}
.render()?;
let rendering = before_render.elapsed();
self.cache
.insert(
name.to_string(),
metadata.clone(),
modified
.duration_since(SystemTime::UNIX_EPOCH)
.unwrap()
.as_secs(),
post.clone(),
&self.config,
)
.await
.unwrap_or_else(|err| warn!("failed to insert {:?} into cache", err.0));
Ok((metadata, post, (parsing, rendering)))
}
async fn list_posts_recursive(
&self,
dir: impl AsRef<Path>,
filter: impl Fn(&PostMetadata) -> bool,
) -> Result<Vec<PostMetadata>, PostError> {
let mut posts = Vec::new();
let mut read_dir = fs::read_dir(dir).await?;
while let Some(entry) = read_dir.next_entry().await? {
let path = entry.path();
let stat = fs::metadata(&path).await?;
if stat.is_file() && path.extension().is_some_and(|ext| ext == "md") {
let mtime = stat
.modified()?
.duration_since(SystemTime::UNIX_EPOCH)
.unwrap()
.as_secs();
let name = path
.clone()
.file_stem()
.unwrap()
.to_string_lossy()
.to_string();
if let Some(hit) = self.cache.lookup_metadata(&name, mtime).await {
posts.push(hit)
} else if let Ok((metadata, ..)) = self.parse_and_render(name, path).await {
posts.push(metadata);
}
self.get_all_posts(|m, _| filter(m))
.await
.map(|vec| vec.into_iter().map(|(meta, ..)| meta).collect())
}
async fn get_all_posts(
&self,
filter: impl Fn(&PostMetadata, &str) -> bool,
) -> Result<Vec<(PostMetadata, String, RenderStats)>, PostError>;
async fn get_max_n_post_metadata_with_optional_tag_sorted(
&self,
n: Option<usize>,
tag: Option<&String>,
) -> Result<Vec<PostMetadata>, PostError> {
let mut posts = self
.get_all_post_metadata(|metadata| !tag.is_some_and(|tag| !metadata.tags.contains(tag)))
.await?;
// we still want some semblance of order if created_at is None, so sort by mtime as well
posts.sort_unstable_by_key(|metadata| metadata.modified_at.unwrap_or_default());
posts.sort_by_key(|metadata| metadata.created_at.unwrap_or_default());
posts.reverse();
if let Some(n) = n {
posts.truncate(n);
}
Ok(posts)
}
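The two-pass ordering above (an unstable sort by `modified_at`, then a stable sort by `created_at`, then a reverse) relies on `sort_by_key` being stable, so posts that tie on `created_at` keep their mtime order. A minimal standalone sketch of the same idea, with plain integers standing in for the timestamps:

```rust
// Two-pass sort: secondary key first (unstable is fine), then a *stable* sort
// by the primary key, so equal primary keys stay ordered by the secondary key.
fn sorted_newest_first(mut posts: Vec<(i64, i64)>) -> Vec<(i64, i64)> {
    // tuples are (created_at, modified_at) stand-ins
    posts.sort_unstable_by_key(|p| p.1);
    posts.sort_by_key(|p| p.0); // stable: preserves mtime order within ties
    posts.reverse();
    posts
}

fn main() {
    let posts = vec![(1, 10), (2, 5), (1, 20), (2, 7)];
    assert_eq!(
        sorted_newest_first(posts),
        vec![(2, 7), (2, 5), (1, 20), (1, 10)]
    );
}
```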
#[allow(unused)]
pub async fn list_posts(&self) -> Result<Vec<PostMetadata>, PostError> {
self.list_posts_recursive(&self.dir).await
}
// third entry in the tuple is whether it got rendered and if so, how long did it take
pub async fn get_post(
&self,
name: &str,
) -> Result<(PostMetadata, String, RenderStats), PostError> {
let start = Instant::now();
let path = self.dir.join(name.to_owned() + ".md");
let stat = match tokio::fs::metadata(&path).await {
Ok(value) => value,
Err(err) => match err.kind() {
io::ErrorKind::NotFound => {
self.cache.remove(name).await;
return Err(PostError::NotFound(name.to_string()));
}
_ => return Err(PostError::IoError(err)),
},
};
let mtime = stat
.modified()?
.duration_since(SystemTime::UNIX_EPOCH)
.unwrap()
.as_secs();
if let Some(hit) = self.cache.lookup(name, mtime, &self.config).await {
Ok((
hit.metadata,
hit.rendered,
RenderStats::Cached(start.elapsed()),
))
} else {
let (metadata, rendered, stats) = self.parse_and_render(name.to_string(), path).await?;
Ok((
metadata,
rendered,
RenderStats::ParsedAndRendered(start.elapsed(), stats.0, stats.1),
))
async fn get_post_metadata(&self, name: &str) -> Result<PostMetadata, PostError> {
match self.get_post(name).await? {
ReturnedPost::Rendered(metadata, ..) => Ok(metadata),
ReturnedPost::Raw(..) => Err(PostError::NotFound(name.to_string())),
}
}
pub fn into_cache(self) -> Cache {
self.cache
}
async fn get_post(&self, name: &str) -> Result<ReturnedPost, PostError>;
async fn cleanup(&self);
}


@@ -0,0 +1,37 @@
pub struct RangedI128Visitor<const START: i128, const END: i128>;
impl<'de, const START: i128, const END: i128> serde::de::Visitor<'de>
for RangedI128Visitor<START, END>
{
type Value = i128;
fn expecting(&self, formatter: &mut std::fmt::Formatter) -> std::fmt::Result {
write!(formatter, "an integer between {START} and {END}")
}
fn visit_i32<E>(self, v: i32) -> std::result::Result<Self::Value, E>
where
E: serde::de::Error,
{
self.visit_i128(v as i128)
}
fn visit_i64<E>(self, v: i64) -> std::prelude::v1::Result<Self::Value, E>
where
E: serde::de::Error,
{
self.visit_i128(v as i128)
}
fn visit_i128<E>(self, v: i128) -> std::prelude::v1::Result<Self::Value, E>
where
E: serde::de::Error,
{
if v >= START && v <= END {
Ok(v)
} else {
Err(E::custom(format!(
"integer is out of bounds ({START}..{END})"
)))
}
}
}
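In practice this visitor would be wired up through serde's `deserialize_with`; the bounds check it performs in `visit_i128` can be exercised standalone. The `check_range` helper below is a hand-copied sketch with the const generics fixed to an example range, not part of the crate:

```rust
// Standalone copy of the range check done by RangedI128Visitor::visit_i128,
// with START/END pinned to a concrete example range.
const START: i128 = 0;
const END: i128 = 100;

fn check_range(v: i128) -> Result<i128, String> {
    if v >= START && v <= END {
        Ok(v)
    } else {
        Err(format!("integer is out of bounds ({START}..{END})"))
    }
}

fn main() {
    assert_eq!(check_range(42), Ok(42));
    assert!(check_range(101).is_err());
    assert!(check_range(-1).is_err());
}
```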

src/serve_dir_included.rs (new file, 81 lines)

@@ -0,0 +1,81 @@
use std::convert::Infallible;
use std::str::pattern::Pattern;
use axum::extract::Request;
use axum::http::{header, StatusCode};
use axum::response::{IntoResponse, Response};
use include_dir::{Dir, DirEntry};
use tracing::{debug, trace};
fn if_empty<'a>(a: &'a str, b: &'a str) -> &'a str {
if a.is_empty() {
b
} else {
a
}
}
fn remove_prefixes(mut src: &str, pat: (impl Pattern + Copy)) -> &str {
while let Some(removed) = src.strip_prefix(pat) {
src = removed;
}
src
}
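These two helpers normalize the request path for `include_dir` lookups: strip every leading `/`, then map the now-empty root path to `"."`. A standalone sketch, simplified to take a `char` instead of the nightly-only `Pattern` trait:

```rust
fn if_empty<'a>(a: &'a str, b: &'a str) -> &'a str {
    if a.is_empty() { b } else { a }
}

// Repeatedly strip a leading character; simplified from the generic Pattern version.
fn remove_prefixes(mut src: &str, pat: char) -> &str {
    while let Some(removed) = src.strip_prefix(pat) {
        src = removed;
    }
    src
}

fn main() {
    assert_eq!(if_empty(remove_prefixes("/static/style.css", '/'), "."), "static/style.css");
    assert_eq!(if_empty(remove_prefixes("/", '/'), "."), ".");
    assert_eq!(if_empty(remove_prefixes("//a", '/'), "."), "a");
}
```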
fn from_included_file(file: &'static include_dir::File<'static>) -> Response {
let mime_type = mime_guess::from_path(file.path()).first_or_octet_stream();
(
[(
header::CONTENT_TYPE,
header::HeaderValue::try_from(mime_type.essence_str()).expect("invalid mime type"),
)],
file.contents(),
)
.into_response()
}
pub async fn handle(
req: Request,
included_dir: &'static Dir<'static>,
) -> Result<Response, Infallible> {
#[cfg(windows)]
compile_error!("this is not safe");
let path = req.uri().path();
let has_dotdot = path.split('/').any(|seg| seg == "..");
if has_dotdot {
return Ok(StatusCode::NOT_FOUND.into_response());
}
let relative_path = if_empty(remove_prefixes(path, '/'), ".");
match included_dir.get_entry(relative_path) {
Some(DirEntry::Dir(dir)) => {
trace!("{relative_path:?} is a directory, trying \"index.html\"");
if let Some(file) = dir.get_file("index.html") {
debug!("{path:?} (index.html) serving from included dir");
return Ok(from_included_file(file));
} else {
trace!("\"index.html\" not found in {relative_path:?} in included files");
}
}
None if relative_path == "." => {
trace!("requested root, trying \"index.html\"");
if let Some(file) = included_dir.get_file("index.html") {
debug!("{path:?} (index.html) serving from included dir");
return Ok(from_included_file(file));
} else {
trace!("\"index.html\" not found in included files");
}
}
Some(DirEntry::File(file)) => {
debug!("{path:?} serving from included dir");
return Ok(from_included_file(file));
}
None => trace!("{relative_path:?} not found in included files"),
};
Ok(StatusCode::NOT_FOUND.into_response())
}


@@ -0,0 +1,9 @@
use std::time::SystemTime;
pub fn as_secs(t: &SystemTime) -> u64 {
match t.duration_since(SystemTime::UNIX_EPOCH) {
Ok(duration) => duration,
Err(err) => err.duration(),
}
.as_secs()
}
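This helper replaces the `duration_since(SystemTime::UNIX_EPOCH).unwrap()` chains removed elsewhere in the diff: for a `SystemTime` before the epoch, `duration_since` returns `Err`, and `SystemTimeError::duration` gives the (positive) distance, so the function never panics. A runnable demonstration, with the function body copied from above:

```rust
use std::time::{Duration, SystemTime};

fn as_secs(t: &SystemTime) -> u64 {
    match t.duration_since(SystemTime::UNIX_EPOCH) {
        Ok(duration) => duration,
        Err(err) => err.duration(), // pre-epoch: positive distance instead of a panic
    }
    .as_secs()
}

fn main() {
    let after = SystemTime::UNIX_EPOCH + Duration::from_secs(42);
    assert_eq!(as_secs(&after), 42);
    let before = SystemTime::UNIX_EPOCH - Duration::from_secs(5);
    assert_eq!(as_secs(&before), 5);
}
```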

src/templates/mod.rs (new file, 186 lines)

@@ -0,0 +1,186 @@
pub mod watcher;
use std::{io, path::Path};
use handlebars::{Handlebars, Template};
use include_dir::{include_dir, Dir};
use thiserror::Error;
use tracing::{debug, error, info_span, trace};
const TEMPLATES: Dir<'static> = include_dir!("$CARGO_MANIFEST_DIR/templates");
const PARTIALS: Dir<'static> = include_dir!("$CARGO_MANIFEST_DIR/partials");
#[derive(Error, Debug)]
#[allow(clippy::enum_variant_names)]
pub enum TemplateError {
#[error(transparent)]
IoError(#[from] std::io::Error),
#[error("file doesn't contain valid UTF-8")]
UTF8Error,
#[error(transparent)]
TemplateError(#[from] handlebars::TemplateError),
}
fn is_ext(path: impl AsRef<Path>, ext: &str) -> bool {
match path.as_ref().extension() {
Some(path_ext) if path_ext != ext => false,
None => false,
_ => true,
}
}
fn get_template_name(path: &Path) -> Option<&str> {
if !is_ext(path, "hbs") {
return None;
}
path.file_stem()?.to_str()
}
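Template names are derived from file names: only `.hbs` files qualify, and the registered name is the file stem. The behavior of these two helpers can be checked standalone (bodies reproduced from above so the sketch compiles on its own):

```rust
use std::path::Path;

// Only accept paths whose extension equals `ext`; extensionless paths are rejected.
fn is_ext(path: impl AsRef<Path>, ext: &str) -> bool {
    match path.as_ref().extension() {
        Some(path_ext) if path_ext != ext => false,
        None => false,
        _ => true,
    }
}

// Template name = file stem of a `.hbs` file, or None for anything else.
fn get_template_name(path: &Path) -> Option<&str> {
    if !is_ext(path, "hbs") {
        return None;
    }
    path.file_stem()?.to_str()
}

fn main() {
    assert_eq!(get_template_name(Path::new("templates/index.hbs")), Some("index"));
    assert_eq!(get_template_name(Path::new("style.css")), None);
    assert_eq!(get_template_name(Path::new("README")), None);
}
```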
fn register_included_file(
file: &include_dir::File<'_>,
name: &str,
registry: &mut Handlebars,
) -> Result<(), TemplateError> {
let template = compile_included_file(file)?;
registry.register_template(name, template);
Ok(())
}
fn register_path(
path: impl AsRef<std::path::Path>,
name: &str,
registry: &mut Handlebars<'_>,
) -> Result<(), TemplateError> {
let template = compile_path(path)?;
registry.register_template(name, template);
Ok(())
}
fn register_partial(
file: &include_dir::File<'_>,
name: &str,
registry: &mut Handlebars,
) -> Result<(), TemplateError> {
registry.register_partial(name, file.contents_utf8().ok_or(TemplateError::UTF8Error)?)?;
Ok(())
}
fn compile_included_file(file: &include_dir::File<'_>) -> Result<Template, TemplateError> {
let contents = file.contents_utf8().ok_or(TemplateError::UTF8Error)?;
let template = Template::compile(contents)?;
Ok(template)
}
fn compile_path(path: impl AsRef<std::path::Path>) -> Result<Template, TemplateError> {
use std::fs::OpenOptions;
use std::io::Read;
let mut file = OpenOptions::new().read(true).open(path)?;
let mut buf = String::new();
file.read_to_string(&mut buf)?;
let template = Template::compile(&buf)?;
Ok(template)
}
async fn compile_path_async_io(
path: impl AsRef<std::path::Path>,
) -> Result<Template, TemplateError> {
use tokio::fs::OpenOptions;
use tokio::io::AsyncReadExt;
let mut file = OpenOptions::new().read(true).open(path).await?;
let mut buf = String::new();
file.read_to_string(&mut buf).await?;
let template = Template::compile(&buf)?;
Ok(template)
}
pub fn new_registry<'a>(custom_templates_path: impl AsRef<Path>) -> io::Result<Handlebars<'a>> {
let mut reg = Handlebars::new();
for entry in TEMPLATES.entries() {
let file = match entry.as_file() {
Some(file) => file,
None => continue,
};
let span = info_span!("register_included_template", path = ?file.path());
let _handle = span.enter();
let name = match get_template_name(file.path()) {
Some(v) => v,
None => {
trace!("skipping file");
continue;
}
};
match register_included_file(file, name, &mut reg) {
Ok(()) => debug!("registered template {name:?}"),
Err(err) => error!("error while registering template: {err}"),
};
}
for entry in PARTIALS.entries() {
let file = match entry.as_file() {
Some(file) => file,
None => continue,
};
let span = info_span!("register_partial", path = ?file.path());
let _handle = span.enter();
let name = match get_template_name(file.path()) {
Some(v) => v,
None => {
trace!("skipping file");
continue;
}
};
match register_partial(file, name, &mut reg) {
Ok(()) => debug!("registered partial {name:?}"),
Err(err) => error!("error while registering partial: {err}"),
};
}
let read_dir = match std::fs::read_dir(custom_templates_path) {
Ok(v) => v,
Err(err) => match err.kind() {
io::ErrorKind::NotFound => return Ok(reg),
_ => panic!("{:?}", err),
},
};
for entry in read_dir {
let entry = entry.unwrap();
let file_type = entry.file_type()?;
if !file_type.is_file() {
continue;
}
let path = entry.path();
let span = info_span!("register_custom_template", ?path);
let _handle = span.enter();
let name = match get_template_name(&path) {
Some(v) => v,
None => {
trace!("skipping file");
continue;
}
};
match register_path(&path, name, &mut reg) {
Ok(()) => debug!("registered template {name:?}"),
Err(err) => error!("error while registering template: {err}"),
};
}
Ok(reg)
}

src/templates/watcher.rs (new file, 126 lines)

@@ -0,0 +1,126 @@
use std::path::Path;
use std::sync::Arc;
use std::time::Duration;
use handlebars::{Handlebars, Template};
use notify_debouncer_full::notify::{self, Watcher};
use notify_debouncer_full::{new_debouncer, DebouncedEvent};
use tokio::select;
use tokio::sync::RwLock;
use tokio_util::sync::CancellationToken;
use tracing::{debug, error, info, trace, trace_span};
use crate::templates::*;
async fn process_event(
event: DebouncedEvent,
templates: &mut Vec<(String, Template)>,
) -> Result<(), Box<dyn std::error::Error>> {
match event.kind {
notify::EventKind::Create(notify::event::CreateKind::File)
| notify::EventKind::Modify(_) => {
for path in &event.paths {
let span = trace_span!("modify_event", ?path);
let _handle = span.enter();
let template_name = match get_template_name(path) {
Some(v) => v,
None => {
trace!("skipping event");
continue;
}
};
trace!("processing recompilation");
let compiled = compile_path_async_io(path).await?;
trace!("compiled template {template_name:?}");
templates.push((template_name.to_owned(), compiled));
}
}
notify::EventKind::Remove(notify::event::RemoveKind::File) => {
for path in &event.paths {
let span = trace_span!("remove_event", ?path);
let _handle = span.enter();
let (file_name, template_name) = match path
.file_name()
.and_then(|o| o.to_str())
.and_then(|file_name| {
get_template_name(Path::new(file_name))
.map(|template_name| (file_name, template_name))
}) {
Some(v) => v,
None => {
trace!("skipping event");
continue;
}
};
trace!("processing removal");
let file = TEMPLATES.get_file(file_name);
if let Some(file) = file {
let compiled = compile_included_file(file)?;
trace!("compiled template {template_name:?}");
templates.push((template_name.to_owned(), compiled));
}
}
}
_ => {}
};
Ok(())
}
pub async fn watch_templates<'a>(
path: impl AsRef<Path>,
watcher_token: CancellationToken,
reg: Arc<RwLock<Handlebars<'a>>>,
) -> Result<(), Box<dyn std::error::Error + Send + Sync + 'static>> {
let path = path.as_ref();
let (tx, mut rx) = tokio::sync::mpsc::channel(1);
let mut debouncer = new_debouncer(Duration::from_millis(100), None, move |events| {
tx.blocking_send(events)
.expect("failed to send message over channel")
})?;
debouncer
.watcher()
.watch(path, notify::RecursiveMode::NonRecursive)?;
'event_loop: while let Some(events) = select! {
_ = watcher_token.cancelled() => {
debug!("exiting watcher loop");
break 'event_loop;
},
events = rx.recv() => events
} {
let events = match events {
Ok(events) => events,
Err(err) => {
error!("error getting events: {err:?}");
continue;
}
};
let mut templates = Vec::new();
for event in events {
trace!("file event: {event:?}");
if let Err(err) = process_event(event, &mut templates).await {
error!("error while processing event: {err}");
}
}
let mut reg = reg.write().await;
for template in templates.into_iter() {
debug!("registered template {}", template.0);
reg.register_template(&template.0, template.1);
}
drop(reg);
info!("updated custom templates");
}
Ok(())
}


@@ -1,76 +0,0 @@
use notify::{event::RemoveKind, Config, EventKind, RecommendedWatcher, RecursiveMode, Watcher};
use tokio_util::sync::CancellationToken;
use tracing::{info, Span};
use crate::append_path::Append;
use crate::compress::compress_epicly;
pub async fn watch(
span: Span,
token: CancellationToken,
config: Config,
) -> Result<(), notify::Error> {
let (tx, mut rx) = tokio::sync::mpsc::channel(12);
let mut watcher = RecommendedWatcher::new(
move |res| {
tx.blocking_send(res)
.expect("failed to send message over channel")
},
config,
)?;
watcher.watch(std::path::Path::new("static"), RecursiveMode::Recursive)?;
while let Some(received) = tokio::select! {
received = rx.recv() => received,
_ = token.cancelled() => return Ok(())
} {
match received {
Ok(event) => {
if event.kind.is_create() || event.kind.is_modify() {
let cloned_span = span.clone();
let compressed =
tokio::task::spawn_blocking(move || -> std::io::Result<u64> {
let _handle = cloned_span.enter();
let mut i = 0;
for path in event.paths {
if path.extension().is_some_and(|ext| ext == "gz") {
continue;
}
info!("{} changed, compressing", path.display());
i += compress_epicly(&path)?;
}
Ok(i)
})
.await
.unwrap()?;
if compressed > 0 {
let _handle = span.enter();
info!(compressed_files=%compressed, "compressed {compressed} files");
}
} else if let EventKind::Remove(remove_event) = event.kind // UNSTABLE
&& matches!(remove_event, RemoveKind::File)
{
for path in event.paths {
if path.extension().is_some_and(|ext| ext == "gz") {
continue;
}
let gz_path = path.clone().append(".gz");
if tokio::fs::try_exists(&gz_path).await? {
info!(
"{} removed, also removing {}",
path.display(),
gz_path.display()
);
tokio::fs::remove_file(&gz_path).await?
}
}
}
}
Err(err) => return Err(err),
}
}
Ok(())
}

static/date.js (new file, 7 lines)

@@ -0,0 +1,7 @@
function replaceDates() {
for (let el of document.querySelectorAll(".date-rfc3339")) {
let date = new Date(Date.parse(el.textContent));
el.textContent = date.toLocaleString();
el.classList.remove("date-rfc3339");
}
}

static/main.js (new file, 12 lines)

@@ -0,0 +1,12 @@
replaceDates();
let form = document.getElementById("sort");
if (form) {
form.style.display = "block";
let postsByDate = document.getElementById("posts");
let postsByName = document.createElement("div");
populateByName(postsByDate, postsByName);
postsByDate.parentNode.appendChild(postsByName);
handleSort(form, postsByDate, postsByName);
sort(form.sort.value, postsByDate, postsByName);
}


@@ -26,8 +26,8 @@ pre > code {
padding: 1.25em 1.5em;
display: block;
background-color: var(--base);
color: var(--text);
background-color: unset;
color: unset;
}
img {
@@ -54,3 +54,22 @@ th,
td:nth-child(1) {
word-break: keep-all;
}
blockquote {
margin-left: 1em;
padding-left: 1.5em;
border-left: 0.5em solid;
border-color: var(--blue);
& > blockquote {
border-color: var(--mauve);
& > blockquote {
border-color: var(--pink);
& > blockquote {
border-color: var(--rosewater);
& > blockquote {
border-color: var(--text);
}
}
}
}
}

static/sort.js (new file, 32 lines)

@@ -0,0 +1,32 @@
function populateByName(source, target) {
let posts = [];
for (let post of source.children) {
let title = post.firstElementChild.innerText;
posts.push([title, post.cloneNode(true)]);
}
posts.sort(([a, _1], [b, _2]) => a.toLocaleLowerCase().localeCompare(b.toLocaleLowerCase()));
for (let [_, post] of posts) {
target.appendChild(post);
}
}
function sort(by, dateEl, nameEl) {
console.log("sorting by", by);
switch (by) {
case "date":
dateEl.style.display = "block";
nameEl.style.display = "none";
break;
case "name":
nameEl.style.display = "block";
dateEl.style.display = "none";
break;
}
}
function handleSort(form, dateEl, nameEl) {
for (let el of form.sort)
el.addEventListener("change", () => {
if (el.checked) sort(el.value, dateEl, nameEl);
});
}


@@ -1,4 +1,5 @@
/* colors */
/* colors from catppuccin https://github.com/catppuccin/catppuccin
licensed under the MIT license, available in the source tree */
:root {
--base: #1e1e2e;
--text: #cdd6f4;
@@ -30,8 +31,7 @@
:root,
code {
/* please have at least one good monospace font */
font-family: "Hack", "Hack Nerd Font", "JetBrains Mono",
"JetBrainsMono Nerd Font", "Ubuntu Mono", monospace, sans-serif;
font-family: "Hack", "Hack Nerd Font", "JetBrains Mono", "JetBrainsMono Nerd Font", "Ubuntu Mono", monospace, sans-serif;
}
:root {
@@ -81,6 +81,43 @@ footer {
opacity: 0.65;
}
div.post {
margin-bottom: 1em;
}
.table {
display: grid;
/*grid-template-columns: auto auto auto;
grid-template-rows: auto auto;*/
width: max-content;
}
.table > :not(.value)::after {
content: ":";
}
.table > .value {
margin-left: 1em;
grid-column: 2;
}
.table > .created {
grid-row: 1;
}
.table > .modified {
grid-row: 2;
}
.table > .tags {
grid-row: 3;
}
#sort {
display: inline-block;
margin-bottom: 1rem;
}
/* BEGIN cool effect everyone liked */
body {

templates/index.hbs (new file, 63 lines)

@@ -0,0 +1,63 @@
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<meta name="description" content="{{title}}" />
<meta property="og:title" content="{{title}}" />
<meta property="og:description" content="{{description}}" />
<meta name="keywords" content="{{joined_tags}}" />
{{#if (ne color null)}}
<meta name="theme-color" content="{{style.color}}" />
{{/if}}
<title>{{title}}</title>
<link rel="stylesheet" href="/static/style.css" />
{{#if rss}}
<link rel="alternate" type="application/rss+xml" title="{{title}}" href="/feed.xml" />
{{/if}}
{{#if js}}
<script src="/static/date.js" defer></script>
<script src="/static/sort.js" defer></script>
<script src="/static/main.js" defer></script>
{{/if}}
</head>
<body>
<main>
<h1>{{title}}</h1>
<p>{{description}}</p>
<h2>posts</h2>
<div>
{{#if js}}
<form id="sort" style="display: none">
sort by: {{sort}}
<br />
<input type="radio" name="sort" id="sort-date" value="date" {{#if (eq style.default_sort "date")}}checked{{/if}} />
<label for="sort-date">date</label>
<input type="radio" name="sort" id="sort-name" value="name" {{#if (eq style.default_sort "name")}}checked{{/if}} />
<label for="sort-name">name</label>
</form>
{{/if}}
{{#each posts}}
<div id="posts">
<div class="post">
<a href="/posts/{{name}}"><b>{{title}}</b></a>
<span class="post-author">- by {{author}}</span>
<br />
{{description}}<br />
{{>post_table post style=@root.style}}
</div>
</div>
{{else}} there are no posts right now. check back later! {{/each}}
</div>
{{#if (gt (len tags) 0)}}
<h2>tags</h2>
<b><a href="/">clear tags</a></b>
<br />
{{/if}}
{{#each tags}}
<a href="/?tag={{@key}}" title="view all posts with this tag">{{@key}}</a>
<span class="post-author">- {{this}} post{{#if (ne this 1)}}s{{/if}}</span><br />
{{/each}}
</main>
</body>
</html>


@@ -1,36 +0,0 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<meta name="description" content="{{ title }}" />
<meta property="og:title" content="{{ title }}" />
<meta property="og:description" content="{{ description }}" />
<title>{{ title }}</title>
<link rel="stylesheet" href="/static/style.css" />
</head>
<body>
<main>
<h1>{{ title }}</h1>
<p>{{ description }}</p>
<h2>posts</h2>
<!-- prettier-ignore -->
<div>
{% for post in posts %}
<p>
<a href="/posts/{{ post.name }}"><b>{{ post.title }}</b></a>
<span class="post-author">- by {{ post.author }}</span>
<br />
{{ post.description }}<br />
{% match post.created_at %} {% when Some(created_at) %}
written:&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; {{ created_at|date }}<br />
{% when None %} {% endmatch %}
{% match post.modified_at %} {% when Some(modified_at) %}
last modified: {{ modified_at|date }}
{% when None %} {% endmatch %}
</p>
{% endfor %}
</div>
</main>
</body>
</html>

templates/post.hbs (new file, 71 lines)

@@ -0,0 +1,71 @@
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<meta name="author" content="{{meta.author}}" />
<meta name="keywords" content="{{joined_tags}}" />
<meta name="description" content="{{meta.title}}" />
<!-- you know what I really love? platforms like discord
favoring twitter embeds over the open standard. to color
your embed or have large images, you have to do _this_. lmao -->
<meta property="og:title" content="{{meta.title}}" />
<meta property="twitter:title" content="{{meta.title}}" />
<meta property="og:description" content="{{meta.description}}" />
<meta property="twitter:description" content="{{meta.description}}" />
{{#if (ne meta.icon null)}}
<meta property="og:image" content="{{meta.icon}}" />
<meta name="twitter:card" content="summary_large_image" />
<meta property="twitter:image:src" content="{{meta.icon}}" />
{{#if (ne meta.icon_alt null)}}
<meta property="og:image:alt" content="{{meta.icon_alt}}" />
<meta property="twitter:image:alt" content="{{meta.icon_alt}}" />
{{/if}}{{/if}}
{{#if (ne color null)}}
<meta name="theme-color" content="{{color}}" />
{{/if}}
<title>{{meta.title}}</title>
<link rel="stylesheet" href="/static/style.css" />
<link rel="stylesheet" href="/static/post.css" />
<link rel="stylesheet" href="/static/custom/style.css" />
<link rel="stylesheet" href="/static/custom/post.css" />
{{#if js}}
<script src="/static/date.js" defer></script>
<script src="/static/main.js" defer></script>
{{/if}}
</head>
<body>
<main>
<h1 class="post-title">
{{meta.title}}
<span class="post-author">- by {{meta.author}}</span>
</h1>
<p class="post-desc">{{meta.description}}</p>
<div class="post">
{{>post_table meta style=@root.style}}
<a href="/posts/{{meta.name}}">link</a><br />
<a href="/">back to home</a>
</div>
<hr />
{{{rendered}}}
</main>
<footer>
{{#each rendered_in}}
{{#if (eq @key "ParsedAndRendered")}}
<span class="tooltipped" title="parsing took {{duration this.[1]}}">parsed</span>
and
<span class="tooltipped" title="rendering took {{duration this.[2]}}">rendered</span>
in
{{duration this.[0]}}
{{else if (eq @key "Cached")}}
retrieved from cache in
{{duration this}}
{{/if}}
{{/each}}
{{#if markdown_access}}
-
<a href="/posts/{{meta.name}}.md">view raw</a>
{{/if}}
</footer>
</body>
</html>


@@ -1,20 +0,0 @@
<h1 class="post-title">
{{ meta.title }}
<span class="post-author">- by {{ meta.author }}</span>
</h1>
<p class="post-desc">{{ meta.description }}</p>
<p>
<!-- prettier-ignore -->
<div>
{% match meta.created_at %} {% when Some(created_at) %}
written:&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; {{ created_at|date }}<br />
{% when None %} {% endmatch %}
{% match meta.modified_at %} {% when Some(modified_at) %}
last modified: {{ modified_at|date }}
{% when None %} {% endmatch %}
</div>
<a href="/posts/{{ meta.name }}">link</a><br />
<a href="/">back to home</a>
</p>
<hr />
{{ rendered_markdown|escape("none") }}


@@ -1,38 +0,0 @@
<!DOCTYPE html>
<html lang="en">
<head>
<head>
<meta charset="UTF-8" />
<meta
name="viewport"
content="width=device-width, initial-scale=1.0"
/>
<meta name="description" content="{{ meta.title }}" />
<meta property="og:title" content="{{ meta.title }}" />
<meta property="og:description" content="{{ meta.description }}" />
{% match meta.icon %} {% when Some with (url) %}
<meta property="og:image" content="{{ url }}" />
<link rel="shortcut icon" href="{{ url }}" />
{% when None %} {% endmatch %}
<title>{{ meta.title }}</title>
<link rel="stylesheet" href="/static/style.css" />
<link rel="stylesheet" href="/static/post.css" />
</head>
</head>
<body>
<main>{{ rendered|escape("none") }}</main>
<!-- prettier-ignore -->
<footer>
{% match rendered_in %}
{% when RenderStats::ParsedAndRendered(total, parsing, rendering) %}
<span class="tooltipped" title="parsing took {{ parsing|duration }}">parsed</span> and
<span class="tooltipped" title="rendering took {{ rendering|duration }}">rendered</span> in {{ total|duration }}
{% when RenderStats::Cached(total) %}
retrieved from cache in {{ total|duration }}
{% endmatch %}
{% if markdown_access %}
- <a href="/posts/{{ meta.name }}.md">view raw</a>
{% endif %}
</footer>
</body>
</html>