65 Commits

Author SHA1 Message Date
3e24c3936a wip: added project modal
still want to add carousel, summary, etc.
2025-12-02 14:07:18 -05:00
82cf30447b added color accents to cards, underlines to subheaders
Some checks failed
Build and Release Docker Images / build-and-push (./backend, public/Dockerfile, my-website-v2_public) (push) Failing after 21m47s
Build and Release Docker Images / build-and-push (./backend, task/Dockerfile, my-website-v2_task) (push) Failing after 23m39s
Build and Release Docker Images / build-and-push (./frontend, Dockerfile, my-website-v2_frontend) (push) Failing after 17m34s
Build and Release Docker Images / create-release (push) Has been skipped
2025-07-31 22:45:23 -04:00
fb071df6e4 modified card colors
Some checks failed
Build and Release Docker Images / build-and-push (./backend, public/Dockerfile, my-website-v2_public) (push) Failing after 21m39s
Build and Release Docker Images / build-and-push (./backend, task/Dockerfile, my-website-v2_task) (push) Failing after 24m46s
Build and Release Docker Images / build-and-push (./frontend, Dockerfile, my-website-v2_frontend) (push) Failing after 17m14s
Build and Release Docker Images / create-release (push) Has been skipped
2025-07-28 13:19:07 -04:00
49374b646b fixed frontend header background color
Some checks failed
Build and Release Docker Images / build-and-push (./backend, public/Dockerfile, my-website-v2_public) (push) Failing after 22m59s
Build and Release Docker Images / build-and-push (./backend, task/Dockerfile, my-website-v2_task) (push) Failing after 23m41s
Build and Release Docker Images / build-and-push (./frontend, Dockerfile, my-website-v2_frontend) (push) Failing after 17m11s
Build and Release Docker Images / create-release (push) Has been skipped
2025-07-27 15:49:37 -04:00
a48396f1bc adjusted the header, footer
Some checks failed
Build and Release Docker Images / build-and-push (./backend, public/Dockerfile, my-website-v2_public) (push) Failing after 26m3s
Build and Release Docker Images / build-and-push (./backend, task/Dockerfile, my-website-v2_task) (push) Failing after 26m40s
Build and Release Docker Images / build-and-push (./frontend, Dockerfile, my-website-v2_frontend) (push) Failing after 17m6s
Build and Release Docker Images / create-release (push) Has been skipped
2025-07-27 00:35:01 -04:00
4762a28cc2 added config file to task
Some checks failed
Build and Release Docker Images / build-and-push (./backend, public/Dockerfile, my-website-v2_public) (push) Failing after 22m5s
Build and Release Docker Images / build-and-push (./backend, task/Dockerfile, my-website-v2_task) (push) Failing after 24m27s
Build and Release Docker Images / build-and-push (./frontend, Dockerfile, my-website-v2_frontend) (push) Failing after 17m2s
Build and Release Docker Images / create-release (push) Has been skipped
2025-07-19 22:26:52 -04:00
4f8e7a654c modified dockerfiles some more
Some checks failed
Build and Release Docker Images / build-and-push (./backend, public/Dockerfile, my-website-v2_public) (push) Failing after 21m10s
Build and Release Docker Images / build-and-push (./backend, task/Dockerfile, my-website-v2_task) (push) Failing after 23m37s
Build and Release Docker Images / build-and-push (./frontend, Dockerfile, my-website-v2_frontend) (push) Failing after 17m1s
Build and Release Docker Images / create-release (push) Has been skipped
2025-07-19 21:52:36 -04:00
88e6b3f1ee modified dockerfiles
Some checks failed
Build and Release Docker Images / build-and-push (./backend, public/Dockerfile, my-website-v2_public) (push) Failing after 20m55s
Build and Release Docker Images / build-and-push (./backend, task/Dockerfile, my-website-v2_task) (push) Failing after 23m36s
Build and Release Docker Images / build-and-push (./frontend, Dockerfile, my-website-v2_frontend) (push) Failing after 17m1s
Build and Release Docker Images / create-release (push) Has been skipped
2025-07-19 21:16:03 -04:00
d590fc505f modified docker compose 2025-07-19 17:38:13 -04:00
9fd936cc31 thirteenth pass in ci 2025-07-19 16:52:14 -04:00
f3237c6171 throwing in the towel v2 2025-07-19 16:40:28 -04:00
ea7881416d throwing in the towel 2025-07-19 16:22:12 -04:00
c98aed3f60 eleventh pass in ci 2025-07-19 16:12:04 -04:00
59e10f779f tenth pass in ci 2025-07-19 15:59:37 -04:00
4dd094bee0 ninth pass in ci 2025-07-19 15:41:29 -04:00
2464d2ab17 eighth pass at ci 2025-07-19 15:27:05 -04:00
7e9cf9a7ec i am gonna lose it
aghhhhhhhhhh
2025-07-19 15:19:16 -04:00
9b1e742db4 missed something 2025-07-19 15:13:53 -04:00
57da15f1e0 sixth pass at ci 2025-07-19 15:11:43 -04:00
1e77fbfd4d fifth pass at ci 2025-07-19 14:00:47 -04:00
b2e22d4b6b fourth pass at ci 2025-07-19 02:05:57 -04:00
3539ceced3 third pass of ci 2025-07-19 01:51:09 -04:00
a83c13a214 second pass of ci 2025-07-19 01:47:09 -04:00
613adcb4c4 modified dockerfiles 2025-07-19 01:28:21 -04:00
8f0ba5289a first pass of continuous integration 2025-07-19 01:10:25 -04:00
79cc4caa58 added dockerfiles
for building the stuff
2025-07-19 00:44:28 -04:00
6eea6724bf cleanup 2025-07-19 00:44:08 -04:00
d73e572527 wip: added docker compose file 2025-07-17 21:23:46 -04:00
3014c1f841 fixed share link button 2025-07-17 21:23:16 -04:00
742a10fb9b removed hot posts
bc comments are not implemented yet
2025-07-17 21:22:12 -04:00
7f04dabf92 fleshed out publish date 2025-07-17 21:21:33 -04:00
bc8e093651 added routes for rss, sitemap, updated footer 2025-07-17 20:09:01 -04:00
71b3b1f42d added publish date 2025-07-17 19:35:45 -04:00
7875c23b67 Merge branch 'master' of https://scm.wyattjmiller.com/wymiller/my-website-v2 2025-07-17 12:38:36 -04:00
48d9360a69 flake update 2025-07-17 12:37:58 -04:00
ef0e9d3777 Merge pull request 'Add pagination' (#3) from pagination into master
Reviewed-on: #3
2025-07-16 20:48:34 -05:00
92b14e63f2 updated deps 2025-07-16 22:02:41 -04:00
baccca1cfa Merge remote-tracking branch 'origin' into pagination 2025-07-16 21:57:52 -04:00
109f8826ff reverted back to old postcard
onClick doesn't want to work, even with the islands, so back to the
drawing board
2025-07-15 23:20:34 -04:00
d58538599c added different favicon 2025-07-14 23:39:47 -04:00
d53f3da4c6 added cache, s3 to taskmanager, ask cache if result is the same, among others 2025-07-14 23:30:29 -04:00
57952ec41d added documentation
it's nothing too novel, okay?
2025-07-14 20:35:49 -04:00
42dff3f186 switched public api to caching library 2025-07-14 20:26:17 -04:00
f4937dc382 create caching library 2025-07-14 20:25:36 -04:00
13d022d44c adjusted css 2025-07-14 20:09:10 -04:00
29011c8f48 added new method to s3clientconfig struct, added from_env to be default 2025-07-14 18:21:13 -04:00
6694f47d70 added pagination to a given authors page 2025-07-07 21:05:27 -04:00
a64b8fdceb impl storage library 2025-06-30 22:58:52 -04:00
a6b4f6917b added storage library 2025-06-30 22:58:25 -04:00
585728de9d fixing tasks upload_sitemap
among other things
2025-06-30 20:38:58 -04:00
5a6346617a remove env 2025-06-30 18:41:22 -04:00
957858de59 added post container 2025-06-30 00:23:41 -04:00
f3c96e675b Merge pull request 'Tasks MR' (#2) from import into master
Reviewed-on: #2
2025-06-29 22:29:19 -05:00
9001c588a0 Merge branch 'master' into import 2025-06-29 22:29:08 -05:00
82e118a0e5 stuff happened 2025-06-29 23:41:20 -04:00
3600166dc5 set inner html for correct display 2025-05-17 21:07:56 -04:00
1503db9509 working import task
wip: not complete, got to fix some of markdown options
2025-05-17 21:04:56 -04:00
c8273016ee switched deprecated dep for maintained dep 2025-05-17 21:04:22 -04:00
6f5b9d4106 cleanup 2025-03-23 12:19:33 -04:00
f0c2f6f3e3 wip: task scheduler works, import task does not work
lol
2025-03-23 12:15:35 -04:00
15a203acf2 moved datetime functions to utils dir 2025-03-17 07:15:36 -04:00
1c461ddb9d fixed mismatched borders on post header, added flex wrap to nav bar 2025-03-17 00:10:58 -04:00
2487a0f421 Merge pull request 'Added caching system' (#1) from caching into master
Reviewed-on: #1
2025-03-16 21:16:52 -05:00
cb6b182042 added s3 crate 2025-03-15 18:02:26 -04:00
3cd6b6e8b3 added health, fallback not found endpoints 2025-03-15 18:02:26 -04:00
103 changed files with 11498 additions and 1318 deletions

.github/workflows/build.yaml vendored Normal file

@@ -0,0 +1,104 @@
name: Build and Release Docker Images

on:
  push:
    branches: [master]
  pull_request:
    branches: [master]
  workflow_dispatch:

env:
  REGISTRY: scm.wyattjmiller.com
  USERNAME: wymiller # Define username here to use consistently

jobs:
  build-and-push:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: write
    strategy:
      matrix:
        include:
          - dockerfile: public/Dockerfile
            image: my-website-v2_public
            context: ./backend
          - dockerfile: task/Dockerfile
            image: my-website-v2_task
            context: ./backend
          - dockerfile: Dockerfile
            image: my-website-v2_frontend
            context: ./frontend
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Log in to Container Registry
        uses: docker/login-action@v3
        with:
          registry: ${{ env.REGISTRY }}
          username: ${{ secrets.GH_ACTION_USERNAME }}
          password: ${{ secrets.GH_ACTION_TOKEN }}

      - name: Extract metadata
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: scm.wyattjmiller.com/wymiller/${{ matrix.image }}
          tags: |
            type=ref,event=branch
            type=ref,event=pr
            type=semver,pattern={{version}}
            type=semver,pattern={{major}}.{{minor}}
            type=semver,pattern={{major}}

      - name: Build and push Docker image
        uses: docker/build-push-action@v5
        with:
          context: ${{ matrix.context }}
          file: ${{ matrix.context }}/${{ matrix.dockerfile }}
          push: true
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}
          cache-from: type=gha
          cache-to: type=gha,mode=max
          platforms: linux/amd64

  create-release:
    if: startsWith(github.ref, 'refs/tags/')
    needs: build-and-push
    runs-on: ubuntu-latest
    permissions:
      contents: write
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Create Release
        uses: actions/create-release@v1
        env:
          GITHUB_TOKEN: ${{ secrets.GH_ACTION_TOKEN }}
        with:
          tag_name: ${{ github.ref_name }}
          release_name: Release ${{ github.ref_name }}
          body: |
            ## Docker Images Released
            The following Docker images have been built and pushed to the container registry:
            - `${{ env.REGISTRY }}/${{ env.USERNAME }}/my-website-v2_public:${{ github.ref_name }}`
            - `${{ env.REGISTRY }}/${{ env.USERNAME }}/my-website-v2_task:${{ github.ref_name }}`
            - `${{ env.REGISTRY }}/${{ env.USERNAME }}/my-website-v2_frontend:${{ github.ref_name }}`
            ### Usage
            ```bash
            docker pull ${{ env.REGISTRY }}/${{ env.USERNAME }}/my-website-v2_public:${{ github.ref_name }}
            docker pull ${{ env.REGISTRY }}/${{ env.USERNAME }}/my-website-v2_task:${{ github.ref_name }}
            docker pull ${{ env.REGISTRY }}/${{ env.USERNAME }}/my-website-v2_frontend:${{ github.ref_name }}
            ```
          draft: false
          prerelease: false


@@ -8,3 +8,5 @@ This is just an organizational way of keeping the backend services together (so
 - [`public`](./public/README.md) - a RESTful API service
 - [`task`](./task/README.md) - a task scheduler service
+- [`storage`](./storage/README.md) - an internal storage library
+- [`cache`](./cache/README.md) - an internal caching library

backend/cache/.gitignore vendored Normal file

@@ -0,0 +1,2 @@
target/
.env

backend/cache/Cargo.lock generated vendored Normal file

File diff suppressed because it is too large

backend/cache/Cargo.toml vendored Normal file

@@ -0,0 +1,9 @@
[package]
name = "cache"
version = "0.1.0"
edition = "2024"

[dependencies]
fred = "10.1.0"
serde = "1.0.219"
serde_json = "1.0.140"

backend/cache/README.md vendored Normal file

@@ -0,0 +1,7 @@
# Caching library

also known as `cache`

## What is this?

An internal caching library that houses the functionality needed for a key-value database like Redis or Valkey. It was split into its own library because both `public` and `task` need the same functionality.
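A minimal usage sketch (assuming a connected `fred` pool already wrapped in `Cache` as in `lib.rs` below; the `Post` type and key name here are illustrative):

use cache::{Cache, Expiration};
use serde::{Deserialize, Serialize};

#[derive(Serialize, Deserialize, Clone, Debug)]
struct Post {
    post_id: i32,
    title: String,
}

async fn demo(mut cache: Cache) -> Result<(), Box<dyn std::error::Error>> {
    // Store a JSON-serialized value with a 10-second TTL,
    // mirroring how the `public` API caches query results.
    let posts = vec![Post { post_id: 1, title: String::from("Hello") }];
    cache
        .set(String::from("posts:all"), &posts, Some(Expiration::EX(10)), None, false)
        .await?;

    // On the next request, consult the cache before hitting Postgres;
    // a miss (or JSON that doesn't match the type) comes back as Ok(None).
    let cached: Option<Vec<Post>> = cache.get(String::from("posts:all")).await?;
    println!("cache hit: {}", cached.is_some());
    Ok(())
}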

backend/cache/src/lib.rs vendored Normal file

@@ -0,0 +1,70 @@
pub use fred::{
    clients::Pool,
    interfaces::{ClientLike, KeysInterface},
    prelude::*,
    types::{Expiration, SetOptions},
};

pub struct Cache {
    pub inmem: Pool,
}

impl Cache {
    /// Fetch `key` and deserialize it from JSON.
    /// Returns `Ok(None)` on a miss or if the stored JSON doesn't match `T`.
    pub async fn get<T>(&mut self, key: String) -> Result<Option<T>, Box<dyn std::error::Error>>
    where
        T: for<'de> serde::Deserialize<'de>,
    {
        self.is_connected()?;
        let value: Option<String> = self.inmem.get(&key).await?;

        match value {
            Some(json_str) => match serde_json::from_str::<T>(&json_str) {
                Ok(deserialized) => Ok(Some(deserialized)),
                Err(_) => Ok(None),
            },
            None => Ok(None),
        }
    }

    /// Serialize `contents` to JSON and store it under `key`,
    /// optionally with an expiration and SET options.
    pub async fn set<T>(
        &mut self,
        key: String,
        contents: &T,
        expiration: Option<Expiration>,
        set_opts: Option<SetOptions>,
        get: bool,
    ) -> Result<(), Box<dyn std::error::Error>>
    where
        T: for<'de> serde::Deserialize<'de> + serde::Serialize,
    {
        self.is_connected()?;
        let json_string = match serde_json::to_string::<T>(contents) {
            Ok(s) => s,
            Err(_) => {
                return Err(Box::new(std::io::Error::new(
                    std::io::ErrorKind::Other,
                    "Unable to serialize contents passed to cache".to_string(),
                )));
            }
        };

        Ok(self
            .inmem
            .set(key, json_string, expiration, set_opts, get)
            .await?)
    }

    /// Delete `key` from the cache.
    pub async fn del(&mut self, key: String) -> Result<(), Box<dyn std::error::Error>> {
        Ok(self.inmem.del(key).await?)
    }

    fn is_connected(&mut self) -> Result<(), Box<dyn std::error::Error>> {
        match self.inmem.is_connected() {
            true => Ok(()),
            false => Err(Box::new(std::io::Error::new(
                std::io::ErrorKind::Other,
                "Not connected to cache".to_string(),
            ))),
        }
    }
}


@@ -0,0 +1,62 @@
{
"db_name": "PostgreSQL",
"query": "SELECT p.post_id, p.author_id, a.first_name, a.last_name, p.title, p.body, p.created_at, p.publish_date FROM posts p LEFT JOIN authors a ON a.author_id = p.author_id WHERE p.deleted_at IS NULL ORDER BY p.created_at DESC LIMIT 10",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "post_id",
"type_info": "Int4"
},
{
"ordinal": 1,
"name": "author_id",
"type_info": "Int4"
},
{
"ordinal": 2,
"name": "first_name",
"type_info": "Varchar"
},
{
"ordinal": 3,
"name": "last_name",
"type_info": "Varchar"
},
{
"ordinal": 4,
"name": "title",
"type_info": "Text"
},
{
"ordinal": 5,
"name": "body",
"type_info": "Text"
},
{
"ordinal": 6,
"name": "created_at",
"type_info": "Timestamptz"
},
{
"ordinal": 7,
"name": "publish_date",
"type_info": "Timestamptz"
}
],
"parameters": {
"Left": []
},
"nullable": [
false,
true,
false,
false,
false,
false,
true,
true
]
},
"hash": "053f5b53a743065aa0105903cdd0ec803861a2477c38a02754d2d350a34aaa68"
}


@@ -0,0 +1,70 @@
{
"db_name": "PostgreSQL",
"query": "SELECT p.post_id, p.author_id, a.first_name, a.last_name, p.title, p.body, p.created_at, p.publish_date, p.is_featured FROM posts p LEFT JOIN authors a ON a.author_id = p.author_id WHERE p.deleted_at IS NULL AND p.post_id = $1 ORDER BY p.created_at DESC",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "post_id",
"type_info": "Int4"
},
{
"ordinal": 1,
"name": "author_id",
"type_info": "Int4"
},
{
"ordinal": 2,
"name": "first_name",
"type_info": "Varchar"
},
{
"ordinal": 3,
"name": "last_name",
"type_info": "Varchar"
},
{
"ordinal": 4,
"name": "title",
"type_info": "Text"
},
{
"ordinal": 5,
"name": "body",
"type_info": "Text"
},
{
"ordinal": 6,
"name": "created_at",
"type_info": "Timestamptz"
},
{
"ordinal": 7,
"name": "publish_date",
"type_info": "Timestamptz"
},
{
"ordinal": 8,
"name": "is_featured",
"type_info": "Bool"
}
],
"parameters": {
"Left": [
"Int4"
]
},
"nullable": [
false,
true,
false,
false,
false,
false,
true,
true,
false
]
},
"hash": "0891ec97ff1d5d5ab7fbc848ceb4e7ea4f46e2f6282170dfdf90ebc6ab6d5fd9"
}


@@ -0,0 +1,62 @@
{
"db_name": "PostgreSQL",
"query": "SELECT p.post_id, p.author_id, a.first_name, a.last_name, p.title, p.body, p.created_at, p.publish_date FROM posts p LEFT JOIN authors a ON a.author_id = p.author_id WHERE p.deleted_at IS NULL GROUP BY p.post_id, a.first_name, a.last_name ORDER BY p.created_at DESC LIMIT 3",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "post_id",
"type_info": "Int4"
},
{
"ordinal": 1,
"name": "author_id",
"type_info": "Int4"
},
{
"ordinal": 2,
"name": "first_name",
"type_info": "Varchar"
},
{
"ordinal": 3,
"name": "last_name",
"type_info": "Varchar"
},
{
"ordinal": 4,
"name": "title",
"type_info": "Text"
},
{
"ordinal": 5,
"name": "body",
"type_info": "Text"
},
{
"ordinal": 6,
"name": "created_at",
"type_info": "Timestamptz"
},
{
"ordinal": 7,
"name": "publish_date",
"type_info": "Timestamptz"
}
],
"parameters": {
"Left": []
},
"nullable": [
false,
true,
false,
false,
false,
false,
true,
true
]
},
"hash": "0ec6c9d94fceba56112e78c82acc56ae01bc3c641e28ee21e331c06e2fd9e551"
}


@@ -0,0 +1,42 @@
{
"db_name": "PostgreSQL",
"query": "INSERT INTO comments (post_id, name, body) VALUES ($1, $2, $3) RETURNING comment_id, name, body, created_at",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "comment_id",
"type_info": "Int4"
},
{
"ordinal": 1,
"name": "name",
"type_info": "Varchar"
},
{
"ordinal": 2,
"name": "body",
"type_info": "Varchar"
},
{
"ordinal": 3,
"name": "created_at",
"type_info": "Timestamptz"
}
],
"parameters": {
"Left": [
"Int4",
"Varchar",
"Varchar"
]
},
"nullable": [
false,
false,
false,
true
]
},
"hash": "1f5f18ecc0f1fe0ea93ca61e3f167640a56fee610379de45017f2608094867f0"
}


@@ -0,0 +1,66 @@
{
"db_name": "PostgreSQL",
"query": "SELECT p.post_id, a.first_name, a.last_name, p.title, p.body, p.created_at, p.publish_date, a.author_id FROM posts p LEFT JOIN authors a ON a.author_id = p.author_id WHERE p.deleted_at IS NULL AND p.author_id = $1 ORDER BY created_at DESC LIMIT $2 OFFSET $3",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "post_id",
"type_info": "Int4"
},
{
"ordinal": 1,
"name": "first_name",
"type_info": "Varchar"
},
{
"ordinal": 2,
"name": "last_name",
"type_info": "Varchar"
},
{
"ordinal": 3,
"name": "title",
"type_info": "Text"
},
{
"ordinal": 4,
"name": "body",
"type_info": "Text"
},
{
"ordinal": 5,
"name": "created_at",
"type_info": "Timestamptz"
},
{
"ordinal": 6,
"name": "publish_date",
"type_info": "Timestamptz"
},
{
"ordinal": 7,
"name": "author_id",
"type_info": "Int4"
}
],
"parameters": {
"Left": [
"Int4",
"Int8",
"Int8"
]
},
"nullable": [
false,
false,
false,
false,
false,
true,
true,
false
]
},
"hash": "3831b52c2db3d1114c4b01a761c74168b66904bacff847844d463454b7fcde43"
}


@@ -0,0 +1,62 @@
{
"db_name": "PostgreSQL",
"query": "SELECT p.post_id, p.author_id, a.first_name, a.last_name, p.title, p.body, p.created_at, p.publish_date FROM posts p LEFT JOIN authors a ON a.author_id = p.author_id LEFT JOIN comments c ON p.post_id = c.post_id WHERE p.deleted_at IS NULL GROUP BY p.post_id, a.first_name, a.last_name ORDER BY COUNT(c.comment_id) DESC LIMIT 3",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "post_id",
"type_info": "Int4"
},
{
"ordinal": 1,
"name": "author_id",
"type_info": "Int4"
},
{
"ordinal": 2,
"name": "first_name",
"type_info": "Varchar"
},
{
"ordinal": 3,
"name": "last_name",
"type_info": "Varchar"
},
{
"ordinal": 4,
"name": "title",
"type_info": "Text"
},
{
"ordinal": 5,
"name": "body",
"type_info": "Text"
},
{
"ordinal": 6,
"name": "created_at",
"type_info": "Timestamptz"
},
{
"ordinal": 7,
"name": "publish_date",
"type_info": "Timestamptz"
}
],
"parameters": {
"Left": []
},
"nullable": [
false,
true,
false,
false,
false,
false,
true,
true
]
},
"hash": "49768c8b986078bdfaad191b3ea1f07ca033b2a734162a3f8fcf0ef0a44c1e7f"
}


@@ -0,0 +1,41 @@
{
"db_name": "PostgreSQL",
"query": "SELECT comment_id, name, body, created_at FROM comments ORDER BY created_at DESC LIMIT $1 OFFSET $2",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "comment_id",
"type_info": "Int4"
},
{
"ordinal": 1,
"name": "name",
"type_info": "Varchar"
},
{
"ordinal": 2,
"name": "body",
"type_info": "Varchar"
},
{
"ordinal": 3,
"name": "created_at",
"type_info": "Timestamptz"
}
],
"parameters": {
"Left": [
"Int8",
"Int8"
]
},
"nullable": [
false,
false,
false,
true
]
},
"hash": "4e39696c45b7533e519452425b5a69d607fd8b99a526002ece8978ccb41f2c69"
}


@@ -0,0 +1,22 @@
{
"db_name": "PostgreSQL",
"query": "SELECT COUNT(*) FROM posts p WHERE p.deleted_at IS NULL AND p.author_id = $1",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "count",
"type_info": "Int8"
}
],
"parameters": {
"Left": [
"Int4"
]
},
"nullable": [
null
]
},
"hash": "51fff32b503c65e62320071ff3ec44060b5fb45049b4f489c9a9d92e592ab5a7"
}


@@ -0,0 +1,46 @@
{
"db_name": "PostgreSQL",
"query": "SELECT author_id, first_name, last_name, bio, image FROM authors WHERE author_id = $1",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "author_id",
"type_info": "Int4"
},
{
"ordinal": 1,
"name": "first_name",
"type_info": "Varchar"
},
{
"ordinal": 2,
"name": "last_name",
"type_info": "Varchar"
},
{
"ordinal": 3,
"name": "bio",
"type_info": "Text"
},
{
"ordinal": 4,
"name": "image",
"type_info": "Text"
}
],
"parameters": {
"Left": [
"Int4"
]
},
"nullable": [
false,
false,
false,
true,
true
]
},
"hash": "9c0f74750e0f90916b3d2f85d0264e27523c14dff7b7adccd5b4cfbb36918901"
}


@@ -0,0 +1,62 @@
{
"db_name": "PostgreSQL",
"query": "SELECT p.post_id, p.author_id, a.first_name, a.last_name, p.title, p.body, p.created_at, p.publish_date FROM posts p LEFT JOIN authors a ON a.author_id = p.author_id WHERE p.deleted_at IS NULL AND p.is_featured IS true GROUP BY p.post_id, a.first_name, a.last_name ORDER BY p.created_at DESC LIMIT 3",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "post_id",
"type_info": "Int4"
},
{
"ordinal": 1,
"name": "author_id",
"type_info": "Int4"
},
{
"ordinal": 2,
"name": "first_name",
"type_info": "Varchar"
},
{
"ordinal": 3,
"name": "last_name",
"type_info": "Varchar"
},
{
"ordinal": 4,
"name": "title",
"type_info": "Text"
},
{
"ordinal": 5,
"name": "body",
"type_info": "Text"
},
{
"ordinal": 6,
"name": "created_at",
"type_info": "Timestamptz"
},
{
"ordinal": 7,
"name": "publish_date",
"type_info": "Timestamptz"
}
],
"parameters": {
"Left": []
},
"nullable": [
false,
true,
false,
false,
false,
false,
true,
true
]
},
"hash": "9d93a8a7c0a2442a511108af36d4adfb1ef8a2fac82448205654742f43dc4e75"
}


@@ -0,0 +1,62 @@
{
"db_name": "PostgreSQL",
"query": "SELECT p.post_id, p.author_id, a.first_name, a.last_name, p.title, p.body, p.created_at, p.publish_date FROM posts p LEFT JOIN authors a ON a.author_id = p.author_id WHERE p.deleted_at IS NULL ORDER BY p.view_count DESC LIMIT 3",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "post_id",
"type_info": "Int4"
},
{
"ordinal": 1,
"name": "author_id",
"type_info": "Int4"
},
{
"ordinal": 2,
"name": "first_name",
"type_info": "Varchar"
},
{
"ordinal": 3,
"name": "last_name",
"type_info": "Varchar"
},
{
"ordinal": 4,
"name": "title",
"type_info": "Text"
},
{
"ordinal": 5,
"name": "body",
"type_info": "Text"
},
{
"ordinal": 6,
"name": "created_at",
"type_info": "Timestamptz"
},
{
"ordinal": 7,
"name": "publish_date",
"type_info": "Timestamptz"
}
],
"parameters": {
"Left": []
},
"nullable": [
false,
true,
false,
false,
false,
false,
true,
true
]
},
"hash": "ad39df8c37105f13b620f8898e570cdbc54d4bd4e402aac65a28c9aa81803831"
}


@@ -0,0 +1,40 @@
{
"db_name": "PostgreSQL",
"query": "SELECT c.comment_id, c.name, c.body, c.created_at FROM comments c LEFT JOIN posts p ON p.post_id = c.post_id WHERE p.post_id = $1 AND c.deleted_at IS NULL ORDER BY created_at DESC LIMIT 20",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "comment_id",
"type_info": "Int4"
},
{
"ordinal": 1,
"name": "name",
"type_info": "Varchar"
},
{
"ordinal": 2,
"name": "body",
"type_info": "Varchar"
},
{
"ordinal": 3,
"name": "created_at",
"type_info": "Timestamptz"
}
],
"parameters": {
"Left": [
"Int4"
]
},
"nullable": [
false,
false,
false,
true
]
},
"hash": "ae5c1527389fd823f46d3b23e5ab3b8211a6323ceff845487abae26096b3fa01"
}


@@ -0,0 +1,47 @@
{
"db_name": "PostgreSQL",
"query": "SELECT author_id, first_name, last_name, bio, image FROM authors ORDER BY created_at DESC LIMIT $1 OFFSET $2",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "author_id",
"type_info": "Int4"
},
{
"ordinal": 1,
"name": "first_name",
"type_info": "Varchar"
},
{
"ordinal": 2,
"name": "last_name",
"type_info": "Varchar"
},
{
"ordinal": 3,
"name": "bio",
"type_info": "Text"
},
{
"ordinal": 4,
"name": "image",
"type_info": "Text"
}
],
"parameters": {
"Left": [
"Int8",
"Int8"
]
},
"nullable": [
false,
false,
false,
true,
true
]
},
"hash": "e6764f22ac7966bdb64386aedffb9edb89aefb248a1f980d2d4e2e20b1c3ca50"
}


@@ -0,0 +1,56 @@
{
"db_name": "PostgreSQL",
"query": "SELECT project_id, title, repo, summary, tech, wip, created_at FROM projects p WHERE deleted_at IS NULL ORDER BY p.created_at DESC",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "project_id",
"type_info": "Int4"
},
{
"ordinal": 1,
"name": "title",
"type_info": "Text"
},
{
"ordinal": 2,
"name": "repo",
"type_info": "Text"
},
{
"ordinal": 3,
"name": "summary",
"type_info": "Text"
},
{
"ordinal": 4,
"name": "tech",
"type_info": "Text"
},
{
"ordinal": 5,
"name": "wip",
"type_info": "Bool"
},
{
"ordinal": 6,
"name": "created_at",
"type_info": "Timestamptz"
}
],
"parameters": {
"Left": []
},
"nullable": [
false,
false,
true,
false,
false,
true,
true
]
},
"hash": "ed764b77d39df0583dc05c3ca721176b8c38e5df5fb078a53b808080c865e64d"
}

backend/public/Cargo.lock generated

File diff suppressed because it is too large


@@ -7,7 +7,7 @@ authors = ["Wyatt J. Miller <wyatt@wyattjmiller.com"]
 # See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html
 [dependencies]
-axum = { version = "0.7.6", features = ["http2", "tokio"] }
+axum = { version = "0.8.4", features = ["http2", "tokio"] }
 tower-http = { version = "0.6.1", features = ["trace", "cors"] }
 tower_governor = "0.4.2"
 tokio = { version = "1.40.0", features = ["full"] }
@@ -25,3 +25,4 @@ serde_json = "1.0.128"
 chrono = "0.4.38"
 xml = "0.8.20"
 fred = "10.1.0"
+cache = { version = "*", path = "../cache" }

backend/public/Dockerfile Normal file

@@ -0,0 +1,12 @@
FROM rust:1.88.0

WORKDIR /app

COPY ./public ./public
COPY ./cache ./cache

RUN cargo build --release --manifest-path ./public/Cargo.toml

EXPOSE 3000

CMD ["/app/public/target/release/public"]


@@ -1,6 +1,9 @@
 use sqlx::{Pool, Postgres};

-use crate::routes::{authors::Author, comments::Pagination, posts::Post};
+use crate::{
+    routes::{authors::Author, posts::Post},
+    utils::pagination::Pagination,
+};

 pub struct AuthorsDatasource;

 impl AuthorsDatasource {
@@ -8,11 +11,11 @@ impl AuthorsDatasource {
         pool: &Pool<Postgres>,
         pagination: Pagination,
     ) -> Result<Vec<Author>, sqlx::Error> {
-        let offset: i64 = (pagination.page_number - 1) * pagination.page_size;
+        let offset: i64 = (pagination.page - 1) * pagination.limit;
         sqlx::query_as!(
             Author,
             "SELECT author_id, first_name, last_name, bio, image FROM authors ORDER BY created_at DESC LIMIT $1 OFFSET $2",
-            pagination.page_size,
+            pagination.page,
             offset,
         )
         .fetch_all(pool)
@@ -32,13 +35,32 @@ impl AuthorsDatasource {
     pub async fn get_authors_posts(
         pool: &Pool<Postgres>,
         author_id: i32,
-    ) -> Result<Vec<Post>, sqlx::Error> {
-        sqlx::query_as!(
-            Post,
-            "SELECT p.post_id, a.first_name, a.last_name, p.title, p.body, p.created_at, a.author_id FROM posts p LEFT JOIN authors a ON a.author_id = p.author_id WHERE p.deleted_at IS NULL AND p.author_id = $1 ORDER BY created_at DESC",
+        pagination: Pagination,
+    ) -> Result<(Vec<Post>, i64), sqlx::Error> {
+        let offset: i64 = (pagination.page - 1) * pagination.limit;
+        println!(
+            "Author ID: {}, Page: {}, Size: {}, Offset: {}",
+            author_id, pagination.page, pagination.limit, offset
+        );
+        let total_count = sqlx::query_scalar!(
+            "SELECT COUNT(*) FROM posts p WHERE p.deleted_at IS NULL AND p.author_id = $1",
             author_id
         )
+        .fetch_one(pool)
+        .await?
+        .unwrap_or(0);
+        let posts_query = sqlx::query_as!(
+            Post,
+            "SELECT p.post_id, a.first_name, a.last_name, p.title, p.body, p.created_at, p.publish_date, a.author_id FROM posts p LEFT JOIN authors a ON a.author_id = p.author_id WHERE p.deleted_at IS NULL AND p.author_id = $1 ORDER BY created_at DESC LIMIT $2 OFFSET $3",
+            author_id,
+            pagination.limit,
+            offset,
+        )
         .fetch_all(pool)
-        .await
+        .await?;
+        Ok((posts_query, total_count))
     }
 }


@@ -1,4 +1,7 @@
-use crate::routes::comments::{Comment, CommentInputPayload, Pagination};
+use crate::{
+    routes::comments::{Comment, CommentInputPayload},
+    utils::pagination::Pagination,
+};
 use sqlx::{Pool, Postgres};

 pub struct CommentsDatasource;
@@ -25,8 +28,8 @@ impl CommentsDatasource {
         pool: &Pool<Postgres>,
         pagination: Pagination,
     ) -> Result<Vec<Comment>, sqlx::Error> {
-        let offset: i64 = (pagination.page_number - 1) * pagination.page_size;
-        sqlx::query_as!(Comment, "SELECT comment_id, name, body, created_at FROM comments ORDER BY created_at DESC LIMIT $1 OFFSET $2", pagination.page_size, offset)
+        let offset: i64 = (pagination.page - 1) * pagination.limit;
+        sqlx::query_as!(Comment, "SELECT comment_id, name, body, created_at FROM comments ORDER BY created_at DESC LIMIT $1 OFFSET $2", pagination.page, offset)
             .fetch_all(pool)
             .await
     }


@@ -1,3 +1,4 @@
 pub mod authors;
 pub mod comments;
 pub mod posts;
+pub mod projects;


@@ -5,7 +5,7 @@ use crate::routes::posts::{Post, PostFeaturedVariant};
 pub struct PostsDatasource;

 impl PostsDatasource {
     pub async fn get_all(pool: &Pool<Postgres>) -> Result<Vec<Post>, sqlx::Error> {
-        sqlx::query_as!(Post, "SELECT p.post_id, p.author_id, a.first_name, a.last_name, p.title, p.body, p.created_at FROM posts p LEFT JOIN authors a ON a.author_id = p.author_id WHERE p.deleted_at IS NULL ORDER BY p.created_at DESC LIMIT 10")
+        sqlx::query_as!(Post, "SELECT p.post_id, p.author_id, a.first_name, a.last_name, p.title, p.body, p.created_at, p.publish_date FROM posts p LEFT JOIN authors a ON a.author_id = p.author_id WHERE p.deleted_at IS NULL ORDER BY p.created_at DESC LIMIT 10")
             .fetch_all(pool)
             .await
     }
@@ -19,31 +19,31 @@ impl PostsDatasource {
             .execute(pool)
             .await;

-        sqlx::query_as!(PostFeaturedVariant, "SELECT p.post_id, p.author_id, a.first_name, a.last_name, p.title, p.body, p.created_at, p.is_featured FROM posts p LEFT JOIN authors a ON a.author_id = p.author_id WHERE p.deleted_at IS NULL AND p.post_id = $1 ORDER BY p.created_at DESC", post_id)
+        sqlx::query_as!(PostFeaturedVariant, "SELECT p.post_id, p.author_id, a.first_name, a.last_name, p.title, p.body, p.created_at, p.publish_date, p.is_featured FROM posts p LEFT JOIN authors a ON a.author_id = p.author_id WHERE p.deleted_at IS NULL AND p.post_id = $1 ORDER BY p.created_at DESC", post_id)
             .fetch_one(pool)
             .await
     }

     pub async fn get_recent(pool: &Pool<Postgres>) -> Result<Vec<Post>, sqlx::Error> {
-        sqlx::query_as!(Post, "SELECT p.post_id, p.author_id, a.first_name, a.last_name, p.title, p.body, p.created_at FROM posts p LEFT JOIN authors a ON a.author_id = p.author_id WHERE p.deleted_at IS NULL GROUP BY p.post_id, a.first_name, a.last_name ORDER BY p.created_at DESC LIMIT 3")
+        sqlx::query_as!(Post, "SELECT p.post_id, p.author_id, a.first_name, a.last_name, p.title, p.body, p.created_at, p.publish_date FROM posts p LEFT JOIN authors a ON a.author_id = p.author_id WHERE p.deleted_at IS NULL GROUP BY p.post_id, a.first_name, a.last_name ORDER BY p.created_at DESC LIMIT 3")
             .fetch_all(pool)
             .await
     }

     pub async fn get_popular(pool: &Pool<Postgres>) -> Result<Vec<Post>, sqlx::Error> {
-        sqlx::query_as!(Post, "SELECT p.post_id, p.author_id, a.first_name, a.last_name, p.title, p.body, p.created_at FROM posts p LEFT JOIN authors a ON a.author_id = p.author_id LEFT JOIN comments c ON p.post_id = c.post_id WHERE p.deleted_at IS NULL GROUP BY p.post_id, a.first_name, a.last_name ORDER BY COUNT(c.comment_id) DESC LIMIT 3")
+        sqlx::query_as!(Post, "SELECT p.post_id, p.author_id, a.first_name, a.last_name, p.title, p.body, p.created_at, p.publish_date FROM posts p LEFT JOIN authors a ON a.author_id = p.author_id LEFT JOIN comments c ON p.post_id = c.post_id WHERE p.deleted_at IS NULL GROUP BY p.post_id, a.first_name, a.last_name ORDER BY COUNT(c.comment_id) DESC LIMIT 3")
             .fetch_all(pool)
             .await
     }

     pub async fn get_hot(pool: &Pool<Postgres>) -> Result<Vec<Post>, sqlx::Error> {
-        sqlx::query_as!(Post, "SELECT p.post_id, p.author_id, a.first_name, a.last_name, p.title, p.body, p.created_at FROM posts p LEFT JOIN authors a ON a.author_id = p.author_id WHERE p.deleted_at IS NULL ORDER BY p.view_count DESC LIMIT 3")
+        sqlx::query_as!(Post, "SELECT p.post_id, p.author_id, a.first_name, a.last_name, p.title, p.body, p.created_at, p.publish_date FROM posts p LEFT JOIN authors a ON a.author_id = p.author_id WHERE p.deleted_at IS NULL ORDER BY p.view_count DESC LIMIT 3")
             .fetch_all(pool)
             .await
     }

     pub async fn get_featured(pool: &Pool<Postgres>) -> Result<Vec<Post>, sqlx::Error> {
-        sqlx::query_as!(Post, "SELECT p.post_id, p.author_id, a.first_name, a.last_name, p.title, p.body, p.created_at FROM posts p LEFT JOIN authors a ON a.author_id = p.author_id WHERE p.deleted_at IS NULL AND p.is_featured IS true GROUP BY p.post_id, a.first_name, a.last_name ORDER BY p.created_at DESC LIMIT 3")
+        sqlx::query_as!(Post, "SELECT p.post_id, p.author_id, a.first_name, a.last_name, p.title, p.body, p.created_at, p.publish_date FROM posts p LEFT JOIN authors a ON a.author_id = p.author_id WHERE p.deleted_at IS NULL AND p.is_featured IS true GROUP BY p.post_id, a.first_name, a.last_name ORDER BY p.created_at DESC LIMIT 3")
             .fetch_all(pool)
             .await
     }


@@ -0,0 +1,15 @@
use sqlx::{FromRow, Pool, Postgres, Row};

use crate::routes::projects::Project;

pub struct ProjectsDatasource;

impl ProjectsDatasource {
    pub async fn get_all(pool: &Pool<Postgres>) -> Result<Vec<Project>, sqlx::Error> {
        sqlx::query_as!(
            Project,
            "SELECT project_id, title, repo, summary, tech, wip, created_at FROM projects p WHERE deleted_at IS NULL ORDER BY p.created_at DESC"
        )
        .fetch_all(pool)
        .await
    }
}


@@ -1,6 +1,6 @@
 use axum::Router;
+use cache::ClientLike;
 use config::config;
-use fred::prelude::*;
 use sqlx::postgres::PgPoolOptions;
 use std::fs::File;
 use std::sync::Arc;
@@ -8,9 +8,9 @@
 use tokio::net::TcpListener;
 use tokio::signal;
 use tokio::sync::Mutex;
-use tower_governor::{governor::GovernorConfigBuilder, GovernorLayer};
+// use tower_governor::{governor::GovernorConfigBuilder, GovernorLayer};
 use tower_http::{
-    cors::{Any, CorsLayer},
+    cors::CorsLayer,
     trace::{self, TraceLayer},
 };
 use tracing_subscriber::{filter, layer::SubscriberExt, prelude::*, util::SubscriberInitExt};
@@ -58,11 +58,6 @@ async fn main() {
     )
     .init();

-    let cors = CorsLayer::new()
-        .allow_methods(Any)
-        .allow_headers(Any)
-        .allow_origin(Any);
-
     // if std::env::var("RUST_ENV").unwrap_or_else(|_| "development".to_string()) != "development" {
     //println!("we're not in development, starting up the rate limiter");
     //let governor_conf = Arc::new(
@@ -101,13 +96,13 @@ async fn main() {
         .expect("Failed to connect to database");

     let pool_size = 8;
-    let config = Config::from_url(&redis_url).unwrap(); // TODO: fix the unwrap <<<
+    let config = cache::Config::from_url(&redis_url).unwrap(); // TODO: fix the unwrap <<<

-    let redis_pool = Builder::from_config(config)
+    let redis_pool = cache::Builder::from_config(config)
         .with_performance_config(|config| {
             config.default_command_timeout = Duration::from_secs(60);
         })
-        .set_policy(ReconnectPolicy::new_exponential(0, 100, 30_000, 2))
+        .set_policy(cache::ReconnectPolicy::new_exponential(0, 100, 30_000, 2))
         .build_pool(pool_size)
         .expect("Failed to create cache pool");
@@ -121,23 +116,18 @@ async fn main() {
     // build our application with some routes
     let app = Router::new()
-        .nest("/", routes::root::RootRoute::routes())
-        .nest("/posts", routes::posts::PostsRoute::routes(&app_state))
-        .nest(
-            "/comments",
-            routes::comments::CommentsRoute::routes(&app_state),
-        )
-        .nest(
-            "/authors",
-            routes::authors::AuthorsRoute::routes(&app_state),
-        )
+        .merge(routes::root::RootRoute::routes())
+        .merge(routes::posts::PostsRoute::routes(&app_state))
+        .merge(routes::comments::CommentsRoute::routes(&app_state))
+        .merge(routes::authors::AuthorsRoute::routes(&app_state))
+        .merge(routes::projects::ProjectsRoute::routes(&app_state))
         .layer(CorsLayer::permissive())
         .layer(
             TraceLayer::new_for_http()
                 .make_span_with(trace::DefaultMakeSpan::new().level(tracing::Level::INFO))
                 .on_response(trace::DefaultOnResponse::new().level(tracing::Level::INFO)),
-        );
-    // .layer(cors);
+        )
+        .fallback(routes::root::RootRoute::not_found);
     //.layer(GovernorLayer {
     //    config: governor_conf,
     //});


@@ -1,16 +1,19 @@
 use axum::{
-    extract::{Path, State},
+    extract::{Path, Query, State},
     http::StatusCode,
     response::IntoResponse,
     routing::get,
     Json,
 };
-use fred::types::Expiration;
+use cache::Expiration;
 use serde::{Deserialize, Serialize};

-use crate::{datasources::authors::AuthorsDatasource, state::AppState};
-
-use super::comments::Pagination;
+use crate::{
+    datasources::authors::AuthorsDatasource,
+    routes::posts::Post,
+    state::AppState,
+    utils::pagination::{Pagination, PaginationQuery},
+};

 #[derive(Deserialize, Serialize, Clone)]
 pub struct Author {
@@ -26,20 +29,31 @@ pub struct AuthorGetOneParams {
     pub id: i32,
 }

+#[derive(Deserialize, Serialize)]
+pub struct AuthorPostsResponse {
+    posts: Vec<Post>,
+    total_posts: i64,
+}
+
 pub struct AuthorsRoute;

 impl AuthorsRoute {
     pub fn routes(app_state: &AppState) -> axum::Router {
         axum::Router::new()
-            .route("/", get(AuthorsRoute::get_all))
-            .route("/:id", get(AuthorsRoute::get_one))
-            .route("/:id/posts", get(AuthorsRoute::get_authors_posts))
+            .route("/authors", get(AuthorsRoute::get_all))
+            .route("/authors/{id}", get(AuthorsRoute::get_one))
+            .route("/authors/{id}/posts", get(AuthorsRoute::get_authors_posts))
             .with_state(app_state.clone())
     }

     async fn get_all(
         State(app_state): State<AppState>,
-        Json(pagination): Json<Pagination>,
+        Query(query): Query<PaginationQuery>,
     ) -> impl IntoResponse {
+        let pagination = Pagination {
+            page: query.page.unwrap_or(1),
+            limit: query.limit.unwrap_or(12),
+        };
         let mut state = app_state.lock().await;

         let cached: Option<Vec<Author>> = state
             .cache
@@ -104,6 +118,7 @@ impl AuthorsRoute {
         let state = app_state.clone();

         tracing::info!("storing database data in cache");
+
         tokio::spawn(async move {
             let mut s = state.lock().await;
             let _ = s
@@ -127,12 +142,20 @@ impl AuthorsRoute {
     async fn get_authors_posts(
         State(app_state): State<AppState>,
         Path(params): Path<AuthorGetOneParams>,
+        Query(pagination): Query<PaginationQuery>,
     ) -> impl IntoResponse {
+        let pagination = Pagination {
+            page: pagination.page.unwrap_or(1),
+            limit: pagination.limit.unwrap_or(12),
+        };
         let state = app_state.lock().await;

-        match AuthorsDatasource::get_authors_posts(&state.database, params.id).await {
-            Ok(p) => Ok(Json(p)),
+        match AuthorsDatasource::get_authors_posts(&state.database, params.id, pagination).await {
+            Ok((posts, total_posts)) => Ok(Json(AuthorPostsResponse { posts, total_posts })),
             Err(e) => Err((StatusCode::INTERNAL_SERVER_ERROR, e.to_string())),
         }
     }
 }


@@ -1,14 +1,20 @@
-use super::posts::{deserialize_datetime, serialize_datetime};
-use crate::{datasources::comments::CommentsDatasource, state::AppState};
+use crate::{
+    datasources::comments::CommentsDatasource,
+    state::AppState,
+    utils::{
+        datetime::*,
+        pagination::{Pagination, PaginationQuery},
+    },
+};
 use axum::{
-    extract::{Path, State},
+    extract::{Path, Query, State},
     http::StatusCode,
     response::IntoResponse,
     routing::{get, post},
     Json,
 };
+use cache::{Expiration, SetOptions};
 use chrono::Utc;
-use fred::types::{Expiration, SetOptions};
 use serde::{Deserialize, Serialize};

 #[derive(Deserialize, Serialize, Debug)]
@@ -22,13 +28,6 @@ pub struct CommentInputPayload {
 pub struct CommentPathParams {
     id: i32,
 }

-#[derive(Deserialize, Serialize)]
-pub struct Pagination {
-    pub page_number: i64,
-    pub page_size: i64,
-}
-
 #[derive(sqlx::FromRow, Deserialize, Serialize, Debug, Clone)]
 pub struct Comment {
     pub comment_id: i32,
@@ -44,9 +43,9 @@ impl CommentsRoute {
     pub fn routes(app_state: &AppState) -> axum::Router {
         // add more comment routes here!
         axum::Router::new()
-            .route("/post/:id", get(CommentsRoute::get_post_comments))
-            .route("/add", post(CommentsRoute::insert_comment))
-            .route("/index", get(CommentsRoute::get_comments_index))
+            .route("/comments/post/{id}", get(CommentsRoute::get_post_comments))
+            .route("/comments/add", post(CommentsRoute::insert_comment))
+            .route("/comments/index", get(CommentsRoute::get_comments_index))
             .with_state(app_state.clone())
     }
@@ -97,8 +96,13 @@ impl CommentsRoute {
     async fn get_comments_index(
         State(app_state): State<AppState>,
-        Json(pagination): Json<Pagination>,
+        Query(query): Query<PaginationQuery>,
     ) -> impl IntoResponse {
+        let pagination = Pagination {
+            page: query.page.unwrap_or(1),
+            limit: query.limit.unwrap_or(12),
+        };
         let state = app_state.lock().await;

         match CommentsDatasource::get_index_comments(&state.database, pagination).await {
@@ -107,3 +111,5 @@ impl CommentsRoute {
         }
     }
 }


@@ -1,4 +1,5 @@
 pub mod authors;
 pub mod comments;
 pub mod posts;
+pub mod projects;
 pub mod root;


@@ -1,8 +1,12 @@
-use std::collections::HashMap;
-use std::fmt;
-
-use crate::utils::rss;
-use crate::{datasources::posts::PostsDatasource, state::AppState};
+use crate::{
+    datasources::posts::PostsDatasource,
+    state::AppState,
+    utils::{
+        datetime::*,
+        rss,
+        sitemap::{self, SitemapEntry},
+    },
+};
 use axum::http::{HeaderMap, HeaderValue};
 use axum::{
     extract::{Path, State},
@@ -11,9 +15,10 @@ use axum::{
     routing::get,
     Json, Router,
 };
+use cache::Expiration;
 use chrono::Utc;
-use fred::types::Expiration;
-use serde::{Deserialize, Deserializer, Serialize, Serializer};
+use serde::{Deserialize, Serialize};
+use std::collections::HashMap;

 #[derive(sqlx::FromRow, Deserialize, Serialize, Debug, Clone)]
 pub struct Post {
@@ -26,6 +31,9 @@ pub struct Post {
     #[serde(serialize_with = "serialize_datetime")]
     #[serde(deserialize_with = "deserialize_datetime")]
     pub created_at: Option<chrono::DateTime<Utc>>,
+    #[serde(serialize_with = "serialize_datetime")]
+    #[serde(deserialize_with = "deserialize_datetime")]
+    pub publish_date: Option<chrono::DateTime<Utc>>,
 }

 #[derive(sqlx::FromRow, Deserialize, Serialize, Debug, Clone)]
@@ -39,6 +47,9 @@ pub struct PostFeaturedVariant {
     #[serde(serialize_with = "serialize_datetime")]
     #[serde(deserialize_with = "deserialize_datetime")]
     pub created_at: Option<chrono::DateTime<Utc>>,
+    #[serde(serialize_with = "serialize_datetime")]
+    #[serde(deserialize_with = "deserialize_datetime")]
+    pub publish_date: Option<chrono::DateTime<Utc>>,
     pub is_featured: Option<bool>,
 }

@@ -52,13 +63,14 @@ impl PostsRoute {
     pub fn routes(app_state: &AppState) -> Router {
         // add more post routes here!
         Router::new()
-            .route("/all", get(PostsRoute::get_all))
-            .route("/:id", get(PostsRoute::get_one))
-            .route("/recent", get(PostsRoute::get_recent_posts))
-            .route("/popular", get(PostsRoute::get_popular_posts))
-            .route("/hot", get(PostsRoute::get_hot_posts))
-            .route("/featured", get(PostsRoute::get_featured_posts))
-            .route("/rss", get(PostsRoute::get_rss_posts))
+            .route("/posts/all", get(PostsRoute::get_all))
+            .route("/posts/{id}", get(PostsRoute::get_one))
+            .route("/posts/recent", get(PostsRoute::get_recent_posts))
+            .route("/posts/popular", get(PostsRoute::get_popular_posts))
+            .route("/posts/hot", get(PostsRoute::get_hot_posts))
+            .route("/posts/featured", get(PostsRoute::get_featured_posts))
+            .route("/posts/rss", get(PostsRoute::get_rss_posts))
+            .route("/posts/sitemap", get(PostsRoute::get_sitemap))
             .with_state(app_state.clone())
     }
@@ -329,7 +341,8 @@ impl PostsRoute {
         match PostsDatasource::get_all(&state.database).await {
             Ok(posts) => {
-                let web_url = std::env::var("BASE_URI_WEB").expect("No environment variable found");
+                let web_url =
+                    std::env::var("BASE_URI_WEB").expect("Environment BASE_URI_WEB variable found");
                 let mapped_posts: HashMap<String, Post> = posts
                     .into_iter()
                     .map(|post| (post.post_id.to_string(), post))
@@ -342,9 +355,42 @@ impl PostsRoute {
                 );
                 let mut headers = HeaderMap::new();
                 headers.insert(
-                    header::CONTENT_DISPOSITION,
-                    HeaderValue::from_str(r#"attachment; filename="posts.xml""#).unwrap(),
+                    header::CONTENT_TYPE,
+                    HeaderValue::from_static("application/xml"),
                 );
+                (headers, xml)
+            }
+            Err(e) => {
+                let mut headers = HeaderMap::new();
+                headers.insert("Content-Type", HeaderValue::from_static("text/plain"));
+                (headers, e.to_string())
+            }
+        }
+    }
+
+    async fn get_sitemap(State(app_state): State<AppState>) -> impl IntoResponse {
+        let state = app_state.lock().await;
+        // let cached: Option<Vec<Post>> = None; // TODO: maybe implement cache, later??
+
+        match PostsDatasource::get_all(&state.database).await {
+            Ok(posts) => {
+                let web_url =
+                    std::env::var("BASE_URI_WEB").expect("Environment BASE_URI_WEB variable found");
+                let mut entries: HashMap<String, SitemapEntry> = posts
+                    .into_iter()
+                    .map(|p| {
+                        (
+                            p.post_id.to_string(),
+                            SitemapEntry {
+                                location: format!("{}/posts/{}", web_url, p.post_id.to_string()),
+                                lastmod: p.created_at.unwrap_or_else(|| chrono::Utc::now()),
+                            },
+                        )
+                    })
+                    .collect();
+                sitemap::get_static_pages(&mut entries, &web_url);
+                let xml: String = sitemap::generate_sitemap(&entries);
+                let mut headers = HeaderMap::new();
                 headers.insert(
                     header::CONTENT_TYPE,
                     HeaderValue::from_static("application/xml"),
@@ -359,56 +405,3 @@ impl PostsRoute {
         }
     }
 }
-
-pub fn serialize_datetime<S>(
-    date: &Option<chrono::DateTime<Utc>>,
-    serializer: S,
-) -> Result<S::Ok, S::Error>
-where
-    S: Serializer,
-{
-    serializer.serialize_str(&date.unwrap().to_rfc3339())
-}
-
-pub fn deserialize_datetime<'de, D>(
-    deserializer: D,
-) -> Result<Option<chrono::DateTime<Utc>>, D::Error>
-where
-    D: Deserializer<'de>,
-{
-    struct DateTimeVisitor;
-
-    impl<'de> serde::de::Visitor<'de> for DateTimeVisitor {
-        type Value = Option<chrono::DateTime<Utc>>;
-
-        fn expecting(&self, formatter: &mut fmt::Formatter) -> fmt::Result {
-            formatter.write_str("an ISO 8601 formatted date string or null")
-        }
-
-        fn visit_str<E>(self, value: &str) -> Result<Self::Value, E>
-        where
-            E: serde::de::Error,
-        {
-            match chrono::DateTime::parse_from_rfc3339(value) {
-                Ok(dt) => Ok(Some(dt.with_timezone(&Utc))),
-                Err(e) => Err(E::custom(format!("Failed to parse datetime: {}", e))),
-            }
-        }
-
-        fn visit_none<E>(self) -> Result<Self::Value, E>
-        where
-            E: serde::de::Error,
-        {
-            Ok(None)
-        }
-
-        fn visit_some<D>(self, deserializer: D) -> Result<Self::Value, D::Error>
-        where
-            D: Deserializer<'de>,
-        {
-            deserializer.deserialize_str(self)
-        }
-    }
-
-    deserializer.deserialize_option(DateTimeVisitor)
-}


@@ -0,0 +1,70 @@
use crate::{datasources::projects::ProjectsDatasource, state::AppState, utils::datetime::*};
use axum::{extract::State, http::StatusCode, response::IntoResponse, routing::get, Json, Router};
use cache::Expiration;
use serde::{Deserialize, Serialize};

#[derive(sqlx::FromRow, Deserialize, Serialize, Debug, Clone)]
pub struct Project {
    pub project_id: i32,
    pub title: String,
    pub repo: Option<String>,
    pub summary: String,
    pub tech: String,
    pub wip: Option<bool>,
    #[serde(serialize_with = "serialize_datetime")]
    #[serde(deserialize_with = "deserialize_datetime")]
    pub created_at: Option<chrono::DateTime<chrono::Utc>>,
}

pub struct ProjectsRoute;

impl ProjectsRoute {
    pub fn routes(app_state: &AppState) -> Router {
        Router::new()
            .route("/projects", get(ProjectsRoute::get_all))
            .with_state(app_state.clone())
    }

    async fn get_all(State(app_state): State<AppState>) -> impl IntoResponse {
        let mut state = app_state.lock().await;

        let cached: Option<Vec<Project>> = state
            .cache
            .get(String::from("projects:all"))
            .await
            .unwrap_or(None);

        if let Some(projects) = cached {
            tracing::info!("grabbing all projects from cache");
            return Ok(Json(projects));
        };

        match ProjectsDatasource::get_all(&state.database).await {
            Ok(projects) => {
                tracing::info!("grabbing all projects from database");
                if let p = &projects {
                    let projects = p.clone();
                    let state = app_state.clone();

                    tracing::info!("storing database data in cache");
                    tokio::spawn(async move {
                        let mut s = state.lock().await;
                        let _ = s
                            .cache
                            .set(
                                String::from("projects:all"),
                                &projects,
                                Some(Expiration::EX(10)),
                                None,
                                false,
                            )
                            .await;
                    });
                };
                Ok(Json(projects))
            }
            Err(e) => Err((StatusCode::INTERNAL_SERVER_ERROR, e.to_string())),
        }
    }
}


@@ -8,15 +8,20 @@ use axum::{
 pub struct RootRoute;

 impl RootRoute {
     pub fn routes() -> Router {
-        Router::new().route("/", get(RootRoute::root))
-        // .fallback(RootRoute::not_found)
+        Router::new()
+            .route("/", get(RootRoute::root))
+            .route("/health", get(RootRoute::health))
     }

     async fn root() -> Html<&'static str> {
         Html("<p>Copyright Wyatt J. Miller 2024</p>")
     }

-    async fn not_found() -> impl IntoResponse {
+    pub async fn not_found() -> impl IntoResponse {
         (StatusCode::NOT_FOUND, "¯\\_(ツ)_/¯")
     }
+
+    async fn health() -> impl IntoResponse {
+        StatusCode::OK
+    }
 }


@@ -1,78 +1,17 @@
-use fred::interfaces::KeysInterface;
-use fred::{clients::Pool, prelude::*};
 use sqlx::PgPool;

 pub type AppState = std::sync::Arc<tokio::sync::Mutex<AppInternalState>>;

 pub struct AppInternalState {
     pub database: sqlx::postgres::PgPool,
-    pub cache: Cache,
-}
-
-pub struct Cache {
-    pub inmem: Pool,
+    pub cache: cache::Cache,
 }

 impl AppInternalState {
-    pub fn new(database: PgPool, cache: Pool) -> Self {
+    pub fn new(database: PgPool, cache: cache::Pool) -> Self {
         AppInternalState {
             database,
-            cache: Cache { inmem: cache },
+            cache: cache::Cache { inmem: cache },
         }
     }
 }
-
-impl Cache {
-    pub async fn get<T>(&mut self, key: String) -> Result<Option<T>, Box<dyn std::error::Error>>
-    where
-        T: for<'de> serde::Deserialize<'de>,
-    {
-        if !self.inmem.is_connected() {
-            return Err(Box::new(std::io::Error::new(
-                std::io::ErrorKind::Other,
-                "Not connected to cache".to_string(),
-            )));
-        }
-
-        let value: Option<String> = self.inmem.get(&key).await?;
-
-        match value {
-            Some(json_str) => match serde_json::from_str::<T>(&json_str) {
-                Ok(deserialized) => Ok(Some(deserialized)),
-                Err(_) => Ok(None),
-            },
-            None => Ok(None),
-        }
-    }
-
-    pub async fn set<T>(
-        &mut self,
-        key: String,
-        contents: &T,
-        expiration: Option<Expiration>,
-        set_opts: Option<SetOptions>,
-        get: bool,
-    ) -> Result<(), Box<dyn std::error::Error>>
-    where
-        T: for<'de> serde::Deserialize<'de> + serde::Serialize,
-    {
-        if !self.inmem.is_connected() {
-            return Err(Box::new(std::io::Error::new(
-                std::io::ErrorKind::Other,
-                "Not connected to cache".to_string(),
-            )));
-        }
-
-        let json_string = serde_json::to_string(contents)?;
-        self.inmem
-            .set(key, json_string, expiration, set_opts, get)
-            .await?;
-        Ok(())
-    }
-
-    pub async fn del(&mut self, key: String) -> Result<(), Box<dyn std::error::Error>> {
-        self.inmem.del(key).await?;
-        Ok(())
-    }
-}
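With the wrapper methods gone, `state.rs` is reduced to glue. Wiring it up looks roughly like this (a sketch assuming the Postgres and Redis pools are built as in `main.rs` above; the function name is illustrative):

use std::sync::Arc;
use tokio::sync::Mutex;

// Wrap both pools into the shared application state. AppInternalState::new
// stores the fred pool inside cache::Cache, so handlers reach Valkey/Redis
// through `state.cache` and Postgres through `state.database`.
fn build_state(database: sqlx::PgPool, redis_pool: cache::Pool) -> AppState {
    Arc::new(Mutex::new(AppInternalState::new(database, redis_pool)))
}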


@@ -0,0 +1,56 @@
use chrono::Utc;
use serde::{Deserializer, Serializer};
use std::fmt;
pub fn serialize_datetime<S>(
date: &Option<chrono::DateTime<Utc>>,
serializer: S,
) -> Result<S::Ok, S::Error>
where
S: Serializer,
{
serializer.serialize_str(&date.unwrap().to_rfc3339())
}
pub fn deserialize_datetime<'de, D>(
deserializer: D,
) -> Result<Option<chrono::DateTime<Utc>>, D::Error>
where
D: Deserializer<'de>,
{
struct DateTimeVisitor;
impl<'de> serde::de::Visitor<'de> for DateTimeVisitor {
type Value = Option<chrono::DateTime<Utc>>;
fn expecting(&self, formatter: &mut fmt::Formatter) -> fmt::Result {
formatter.write_str("an ISO 8601 formatted date string or null")
}
fn visit_str<E>(self, value: &str) -> Result<Self::Value, E>
where
E: serde::de::Error,
{
match chrono::DateTime::parse_from_rfc3339(value) {
Ok(dt) => Ok(Some(dt.with_timezone(&Utc))),
Err(e) => Err(E::custom(format!("Failed to parse datetime: {}", e))),
}
}
fn visit_none<E>(self) -> Result<Self::Value, E>
where
E: serde::de::Error,
{
Ok(None)
}
fn visit_some<D>(self, deserializer: D) -> Result<Self::Value, D::Error>
where
D: Deserializer<'de>,
{
deserializer.deserialize_str(self)
}
}
deserializer.deserialize_option(DateTimeVisitor)
}
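These helpers are wired up per-field with serde attributes; a small sketch (the struct is illustrative, not the real post model):

use chrono::Utc;
use serde::{Deserialize, Serialize};

#[derive(Serialize, Deserialize)]
struct PostStub {
    title: String,
    // Route the optional timestamp through the custom RFC 3339 helpers above
    #[serde(
        serialize_with = "serialize_datetime",
        deserialize_with = "deserialize_datetime"
    )]
    publish_date: Option<chrono::DateTime<Utc>>,
}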

View File

@@ -1 +1,4 @@
pub mod datetime;
pub mod pagination;
pub mod rss; pub mod rss;
pub mod sitemap;

View File

@@ -0,0 +1,13 @@
use serde::{Deserialize, Serialize};
#[derive(Deserialize, Serialize)]
pub struct PaginationQuery {
pub page: Option<i64>,
pub limit: Option<i64>,
}
#[derive(Deserialize, Serialize)]
pub struct Pagination {
pub page: i64,
pub limit: i64,
}
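Both fields are optional so each route can apply its own defaults; a sketch of the normalization a handler might do (the default of 10 and the cap of 100 are assumptions, not values from the codebase):

fn normalize(query: PaginationQuery) -> Pagination {
    Pagination {
        // Assumed defaults: page 1, 10 items per page, capped at 100
        page: query.page.unwrap_or(1).max(1),
        limit: query.limit.unwrap_or(10).clamp(1, 100),
    }
}

// OFFSET pairing for a LIMIT/OFFSET query built from the normalized values
fn sql_offset(p: &Pagination) -> i64 {
    (p.page - 1) * p.limit
}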

View File

@@ -13,7 +13,8 @@ pub struct RssEntry {
impl From<posts::Post> for RssEntry {
    fn from(post: posts::Post) -> Self {
        let web_url =
            std::env::var("BASE_URI_WEB").expect("Environment variable BASE_URI_WEB not found");
        let post_url = format!("{}{}{}", web_url, "/posts/", post.post_id.to_string());
        let author_full_name = format!("{} {}", post.first_name.unwrap(), post.last_name.unwrap());
@@ -58,10 +59,7 @@ pub fn generate_rss(
    link: &str,
    posts: &HashMap<String, posts::Post>,
) -> String {
    let values = posts.clone().into_values();
    let rss_entries = values
        .map(|p| p.into())
        .map(|r: RssEntry| r.to_item())
@@ -69,22 +67,24 @@
    let safe_title = escape_str_pcdata(title);
    let safe_description = escape_str_pcdata(description);

    // TODO: change the atom link in this string - it's not correct
    // change it when we know the URL
    format!(
        r#"<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
<channel>
<title>{safe_title}</title>
<description>{safe_description}</description>
<link>{link}</link>
<language>en-us</language>
<ttl>60</ttl>
<generator>Kyouma 1.0.0-SE</generator>
<atom:link href="https://wyattjmiller.com/posts.xml" rel="self" type="application/rss+xml" />
{}
</channel>
</rss>
"#,
        rss_entries
    )
}

View File

@@ -0,0 +1,61 @@
use std::collections::HashMap;
pub struct SitemapEntry {
pub location: String,
pub lastmod: chrono::DateTime<chrono::Utc>,
}
impl SitemapEntry {
fn to_item(&self) -> String {
format!(
r#"
<url>
<loc>{}</loc>
<lastmod>{}</lastmod>
</url>
"#,
self.location,
self.lastmod.to_rfc3339(),
)
}
}
pub fn generate_sitemap(entries: &HashMap<String, SitemapEntry>) -> String {
let urls = entries
.values()
.map(|entry| entry.to_item())
.collect::<String>();
format!(
r#"
<!-- Generated by Kyouma 1.0.0-SE -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
{}
</urlset>
"#,
urls
)
}
pub fn get_static_pages(entries: &mut HashMap<String, SitemapEntry>, web_url: &String) {
entries.insert(
"10000".to_string(),
SitemapEntry {
location: web_url.clone(),
lastmod: chrono::Utc::now(),
},
);
entries.insert(
"10001".to_string(),
SitemapEntry {
location: format!("{}/posts", web_url),
lastmod: chrono::Utc::now(),
},
);
entries.insert(
"10002".to_string(),
SitemapEntry {
location: format!("{}/projects", web_url),
lastmod: chrono::Utc::now(),
},
);
}
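Taken together, a caller seeds the map with post entries from the database, mixes in the static pages, and renders the XML. A sketch with made-up post data (the static pages use the 10000+ keys above, presumably so they never collide with post ids):

use std::collections::HashMap;

fn build_sitemap(web_url: &String) -> String {
    let mut entries: HashMap<String, SitemapEntry> = HashMap::new();
    // Post entries would normally be keyed by post id from the database
    entries.insert(
        "1".to_string(),
        SitemapEntry {
            location: format!("{}/posts/1", web_url),
            lastmod: chrono::Utc::now(),
        },
    );
    get_static_pages(&mut entries, web_url);
    generate_sitemap(&entries)
}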

2
backend/storage/.gitignore vendored Normal file
View File

@@ -0,0 +1,2 @@
target/
.env

3589
backend/storage/Cargo.lock generated Normal file

File diff suppressed because it is too large Load Diff

View File

@@ -0,0 +1,15 @@
[package]
name = "storage"
description = "Internal object storage library"
version = "0.1.0"
edition = "2024"
[dependencies]
aws-sdk-s3 = "1.94.0"
aws-config = "1.8"
azure_core = "0.25.0"
azure_storage = "0.21.0"
azure_storage_blobs = "0.21.0"
async-trait = "0.1"
tokio = { version = "1.0", features = ["full"] }
thiserror = "2.0.12"

View File

@@ -0,0 +1,7 @@
# Storage library
also known as `storage`
## What is this?
An internal storage library. This was needed because both `public` and `task` needed storage functionality. Additionally, this helps maintainability and avoids duplicate code.

View File

@@ -0,0 +1,11 @@
use azure_core::error::HttpError;
#[derive(Debug, thiserror::Error)]
pub enum AdapterError {
#[error("Azure error: {0}")]
Azure(#[from] azure_core::Error),
#[error("HTTP error: {0}")]
Http(#[from] HttpError),
#[error("Serialization error: {0}")]
Serialization(String),
}

View File

@@ -0,0 +1,2 @@
pub mod error;
pub mod services;

View File

@@ -0,0 +1,126 @@
use crate::{error::AdapterError, services::ObjectStorageClient};
use async_trait::async_trait;
use aws_config::{BehaviorVersion, Region};
use aws_sdk_s3::{Client, Config, config::Credentials};
use std::env;
#[derive(Clone, Debug)]
pub struct S3ClientConfig {
pub access_key: String,
secret_key: String,
endpoint: String,
pub bucket: String,
region: String,
}
#[derive(Clone)]
pub struct S3Client {
client: Client,
pub client_config: S3ClientConfig,
}
impl S3ClientConfig {
pub fn new(
access_key: &str,
secret_key: &str,
endpoint: &str,
bucket: &str,
region: &str,
) -> Result<Self, Box<dyn std::error::Error>> {
Ok(S3ClientConfig {
access_key: access_key.to_owned(),
secret_key: secret_key.to_owned(),
endpoint: endpoint.to_owned(),
bucket: bucket.to_owned(),
region: region.to_owned(),
})
}
pub fn from_env() -> Result<Self, Box<dyn std::error::Error>> {
Ok(S3ClientConfig {
access_key: env::var("S3_ACCESS_KEY")
.map_err(|_| "S3_ACCESS_KEY environment variable not set")?,
secret_key: env::var("S3_SECRET_KEY")
.map_err(|_| "S3_SECRET_KEY environment variable not set")?,
endpoint: env::var("S3_ENDPOINT")
.unwrap_or_else(|_| "us-ord-1.linodeobjects.com".to_string()),
bucket: env::var("S3_BUCKET").map_err(|_| "S3_BUCKET environment variable not set")?,
region: env::var("S3_REGION").unwrap_or_else(|_| "us-ord".to_string()),
})
}
}
impl S3Client {
pub fn new(config: &S3ClientConfig) -> Self {
let credentials = Credentials::new(
&config.access_key,
&config.secret_key,
None,
None,
"linode-object-storage",
);
let s3_config = Config::builder()
.behavior_version(BehaviorVersion::latest())
.region(Region::new(config.region.clone()))
.endpoint_url(format!("https://{}", config.endpoint))
.credentials_provider(credentials)
.force_path_style(false)
.build();
Self {
client: Client::from_conf(s3_config),
client_config: config.clone(),
}
}
}
#[async_trait]
impl ObjectStorageClient for S3Client {
type Error = AdapterError;
async fn put_object(&self, bucket: &str, key: &str, data: Vec<u8>) -> Result<(), Self::Error> {
println!("Uploading to S3 (or S3 like) Object Storage...");
println!("Bucket: {}", bucket);
let _ = self
.client
.put_object()
.bucket(bucket)
.key(key)
.body(data.into())
.acl(aws_sdk_s3::types::ObjectCannedAcl::PublicRead)
.content_type("application/xml")
.send()
.await
.unwrap();
Ok(())
}
async fn get_object(&self, _bucket: &str, _key: &str) -> Result<Vec<u8>, Self::Error> {
todo!("not impl")
}
async fn delete_object(&self, _bucket: &str, _key: &str) -> Result<(), Self::Error> {
todo!("not impl")
}
async fn list_objects(
&self,
_bucket: &str,
_prefix: Option<&str>,
) -> Result<Vec<String>, Self::Error> {
todo!("not impl")
}
async fn object_exists(&self, _bucket: &str, _key: &str) -> Result<bool, Self::Error> {
todo!("not impl")
}
}
impl Default for S3ClientConfig {
fn default() -> Self {
S3ClientConfig::from_env().unwrap()
}
}
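Construction is config-first: build an `S3ClientConfig` (explicitly or from the environment) and hand it to `S3Client::new`. A minimal upload sketch, assuming the `S3_*` environment variables are set; note that `put_object` currently hardcodes a public-read ACL and an XML content type:

// Requires the ObjectStorageClient trait in scope for put_object
async fn upload_feed() -> Result<(), AdapterError> {
    let config = S3ClientConfig::from_env().expect("S3_* environment variables not set");
    let client = S3Client::new(&config);
    // Key and payload are placeholders
    client
        .put_object(config.bucket.as_str(), "feed.xml", b"<rss/>".to_vec())
        .await
}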

View File

@@ -0,0 +1,71 @@
use crate::error::AdapterError;
use async_trait::async_trait;
use azure_storage::prelude::*;
use azure_storage_blobs::prelude::*;
use super::ObjectStorageClient;
pub struct AzureBlobClient {
client: BlobServiceClient,
}
impl AzureBlobClient {
pub fn new(account_name: &str, account_key: String) -> Self {
let storage_credentials = StorageCredentials::access_key(account_name, account_key);
let client = BlobServiceClient::new(account_name, storage_credentials);
Self { client }
}
// Helper method to get container client
fn get_container_client(&self, container_name: &str) -> ContainerClient {
self.client.container_client(container_name)
}
// Helper method to get blob client
fn get_blob_client(&self, container_name: &str, blob_name: &str) -> BlobClient {
self.get_container_client(container_name)
.blob_client(blob_name)
}
}
#[async_trait]
impl ObjectStorageClient for AzureBlobClient {
type Error = AdapterError;
async fn put_object(
&self,
bucket: &str, // container name
key: &str, // blob name
data: Vec<u8>,
) -> Result<(), Self::Error> {
let blob_client = self.get_blob_client(bucket, key);
let _request = blob_client.put_block_blob(data).await.unwrap();
Ok(())
}
async fn get_object(&self, bucket: &str, key: &str) -> Result<Vec<u8>, Self::Error> {
let blob_client = self.get_blob_client(bucket, key);
let response = blob_client.get_content().await.unwrap();
Ok(response)
}
async fn delete_object(&self, bucket: &str, key: &str) -> Result<(), Self::Error> {
let blob_client = self.get_blob_client(bucket, key);
blob_client.delete().await.unwrap();
Ok(())
}
async fn list_objects(
&self,
_bucket: &str,
_prefix: Option<&str>,
) -> Result<Vec<String>, Self::Error> {
todo!("not impl")
}
async fn object_exists(&self, _bucket: &str, _key: &str) -> Result<bool, Self::Error> {
todo!("not impl")
}
}

View File

@@ -0,0 +1,23 @@
pub mod aws;
pub mod azure;
use async_trait::async_trait;
#[async_trait]
pub trait ObjectStorageClient {
type Error;
async fn put_object(&self, bucket: &str, key: &str, data: Vec<u8>) -> Result<(), Self::Error>;
async fn get_object(&self, bucket: &str, key: &str) -> Result<Vec<u8>, Self::Error>;
async fn delete_object(&self, bucket: &str, key: &str) -> Result<(), Self::Error>;
async fn list_objects(
&self,
bucket: &str,
prefix: Option<&str>,
) -> Result<Vec<String>, Self::Error>;
async fn object_exists(&self, bucket: &str, key: &str) -> Result<bool, Self::Error>;
}
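Because both the S3 and Azure adapters implement this trait, task code can be written once against it; a sketch of a backend-agnostic helper:

// Generic over any storage backend that implements the trait
async fn mirror_xml<C: ObjectStorageClient>(
    client: &C,
    bucket: &str,
    key: &str,
    xml: String,
) -> Result<(), C::Error> {
    client.put_object(bucket, key, xml.into_bytes()).await
}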

View File

@@ -1 +0,0 @@
DATABASE_URL=postgres://wyatt:wyattisawesome@localhost:5432/postgres

View File

@@ -0,0 +1,46 @@
{
"db_name": "PostgreSQL",
"query": "INSERT INTO logs (task_id, created_at, task_status) VALUES ($1, now(), 'pending') RETURNING task_id, log_id, created_at, task_status, finished_at",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "task_id",
"type_info": "Int4"
},
{
"ordinal": 1,
"name": "log_id",
"type_info": "Int4"
},
{
"ordinal": 2,
"name": "created_at",
"type_info": "Timestamptz"
},
{
"ordinal": 3,
"name": "task_status",
"type_info": "Text"
},
{
"ordinal": 4,
"name": "finished_at",
"type_info": "Timestamptz"
}
],
"parameters": {
"Left": [
"Int4"
]
},
"nullable": [
false,
false,
false,
false,
true
]
},
"hash": "364c58ab7678af9d36003af9858e69b876be3939a4d9f34a95950ab7cc166778"
}

View File

@@ -0,0 +1,22 @@
{
"db_name": "PostgreSQL",
"query": "SELECT EXISTS(SELECT 1 FROM posts p WHERE p.filename = $1) as filename",
"describe": {
"columns": [
{
"ordinal": 0,
"name": "filename",
"type_info": "Bool"
}
],
"parameters": {
"Left": [
"Text"
]
},
"nullable": [
null
]
},
"hash": "723a24f681b1b7866e4a2636ddda2bb8ed78d60540158ffa0fbebba4bdbfa2b9"
}

View File

@@ -0,0 +1,15 @@
{
"db_name": "PostgreSQL",
"query": "UPDATE logs SET task_status = $1 WHERE task_id = $2",
"describe": {
"columns": [],
"parameters": {
"Left": [
"Text",
"Int4"
]
},
"nullable": []
},
"hash": "e3f9cdc6fede1a8601c3775e829f04eef5b00cf7bc5a087b5ba5c70f99e76763"
}

3582
backend/task/Cargo.lock generated

File diff suppressed because it is too large Load Diff

View File

@@ -6,7 +6,10 @@ edition = "2021"
# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

[dependencies]
storage = { version = "0.1.0", path = "../storage" }
cache = { version = "0.1.0", path = "../cache" }
tokio = { version = "1.19.2", features = ["full"] }
reqwest = { version = "0.12.20", features = ["json", "rustls-tls"] }
job_scheduler = "1.2.1"
sqlx = { version = "0.8.2", features = [
    "postgres",
@@ -18,5 +21,7 @@ once_cell = "1.19.0"
dotenvy = "0.15.7"
futures = "0.3.30"
markdown = "1.0.0-alpha.20"
serde = { version = "*", features = ["derive"] }
serde_yml = "*"
tracing = "0.1"
tracing-subscriber = { version = "0.3.18", features = ["env-filter"] }

14
backend/task/Dockerfile Normal file
View File

@@ -0,0 +1,14 @@
FROM rust:1.88.0
WORKDIR /app
COPY ./task ./task
COPY ./cache ./cache
COPY ./storage ./storage
RUN mkdir /app/posts
RUN cargo build --release --manifest-path ./task/Cargo.toml
EXPOSE 3000
CMD ["/app/task/target/release/task"]

View File

@@ -4,4 +4,17 @@ also known as `task`
## What is this?
This is a task runner/scheduler program that fires off various tasks. These tasks can be anything from a blog post import task to an RSS generator task. Additionally, task logs are kept in the database so that you can keep track of tasks when something goes wrong.

## Things you should know
`task` uses a `.env` file at the root of the project. The file takes standard environment variables (like the environment variables you would put into a `.bashrc` or ad-hoc into your shell).

For `task` to work properly, please make sure to first create the `.env` file, then fill out the following environment variables:

- `BASE_URI_API` - needed for communicating with `public`
- `DATABASE_URL` - needed for communicating to Postgres
- `REDIS_URL` - needed for communicating with the cache (Redis or Valkey)
- `S3_ACCESS_KEY` - needed for Amazon S3 (or compatible services) storage
- `S3_SECRET_KEY` - needed for Amazon S3 (or compatible services) storage
- `S3_BUCKET` - needed for Amazon S3 (or compatible services) storage
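
For example, a development `.env` might look like this (all values are placeholders):

```
BASE_URI_API=http://localhost:3000
DATABASE_URL=postgres://user:password@localhost:5432/postgres
REDIS_URL=redis://localhost:6379
S3_ACCESS_KEY=changeme
S3_SECRET_KEY=changeme
S3_BUCKET=my-bucket
```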

View File

@@ -0,0 +1,11 @@
use std::path::PathBuf;
pub fn config() -> Configuration {
Configuration {
env: dotenvy::dotenv(),
}
}
pub struct Configuration {
env: Result<PathBuf, dotenvy::Error>,
}

View File

@@ -1,16 +1,20 @@
use cache::ClientLike;
use chrono::Utc; use chrono::Utc;
use sqlx::{postgres::PgPoolOptions, Pool, Postgres}; use sqlx::{postgres::PgPoolOptions, Pool, Postgres};
use std::env; use std::env;
use std::sync::Arc; use std::sync::Arc;
use std::time::Duration; use std::time::Duration;
use tasks::import_posts; use storage::services::aws;
use tasks::*;
//mod config; mod config;
mod tasks; mod tasks;
mod utils; mod utils;
pub struct TaskManager<'a> { pub struct TaskManager<'a> {
pool: Pool<Postgres>, pool: Pool<Postgres>,
cache: cache::Pool,
s3_client: aws::S3Client,
jobs: Vec<TaskJob>, jobs: Vec<TaskJob>,
last_activated: Option<chrono::DateTime<Utc>>, last_activated: Option<chrono::DateTime<Utc>>,
last_job: Option<TaskJob>, last_job: Option<TaskJob>,
@@ -29,6 +33,7 @@ pub struct TaskLog {
#[derive(Debug)] #[derive(Debug)]
pub enum TaskStatus { pub enum TaskStatus {
Pending(String), Pending(String),
Running(String),
Completed(String), Completed(String),
Failed(String), Failed(String),
} }
@@ -47,7 +52,9 @@ pub struct TaskJob {
async fn main() { async fn main() {
println!("Hello, world!"); println!("Hello, world!");
dotenvy::dotenv().unwrap(); let _ = config::config();
// setup database
let database_url = let database_url =
env::var("DATABASE_URL").expect("Environment variable DATABASE_URL is not found"); env::var("DATABASE_URL").expect("Environment variable DATABASE_URL is not found");
let pool = PgPoolOptions::new() let pool = PgPoolOptions::new()
@@ -57,8 +64,36 @@ async fn main() {
.await .await
.expect("Failed to connect to the database"); .expect("Failed to connect to the database");
let mut manager = TaskManager::new(pool); // setup redis/valkey
manager.register_jobs().await; let redis_url = match std::env::var("REDIS_URL").unwrap().as_str() {
// TODO: fix the unwrap ^
"" => "redis://localhost:6379".to_string(),
x => x.to_string(),
};
let pool_size = 8;
let config = cache::Config::from_url(&redis_url).unwrap(); // TODO: fix the unwrap <<<
let redis_pool = cache::Builder::from_config(config)
.with_performance_config(|config| {
config.default_command_timeout = Duration::from_secs(60);
})
.set_policy(cache::ReconnectPolicy::new_exponential(0, 100, 30_000, 2))
.build_pool(pool_size)
.expect("Failed to create cache pool");
if std::env::var("REDIS_URL").unwrap() != "" {
// TODO: fix the unwrap ^
redis_pool.init().await.expect("Failed to connect to cache");
let _ = redis_pool.flushall::<i32>(false).await;
}
// setup storage
let s3_client_config = aws::S3ClientConfig::from_env().unwrap();
let s3_client = aws::S3Client::new(&s3_client_config);
let mut manager = TaskManager::new(pool, redis_pool, s3_client);
manager.register_jobs().await.unwrap();
loop { loop {
manager.scheduler.tick(); manager.scheduler.tick();
@@ -67,9 +102,11 @@ async fn main() {
} }
impl<'a> TaskManager<'a> { impl<'a> TaskManager<'a> {
fn new(pool: Pool<Postgres>) -> Self { fn new(pool: Pool<Postgres>, cache: cache::Pool, s3_client: aws::S3Client) -> Self {
TaskManager { TaskManager {
pool, pool,
cache,
s3_client,
jobs: Vec::new(), jobs: Vec::new(),
last_activated: None, last_activated: None,
last_job: None, last_job: None,
@@ -77,23 +114,44 @@ impl<'a> TaskManager<'a> {
} }
} }
pub async fn register_jobs(&self) { pub async fn register_jobs(&mut self) -> Result<(), Box<dyn std::error::Error>> {
// let jobs: Vec<Job> = Vec::new();
let results = sqlx::query_as::<_, TaskJob>("SELECT task_id, task_name, schedule, is_active, created_at, deleted_at FROM tasks WHERE is_active = true AND deleted_at IS NULL") let results = sqlx::query_as::<_, TaskJob>("SELECT task_id, task_name, schedule, is_active, created_at, deleted_at FROM tasks WHERE is_active = true AND deleted_at IS NULL")
.fetch_all(&self.pool) .fetch_all(&self.pool)
.await .await?;
.unwrap();
let mut scheduler = job_scheduler::JobScheduler::new(); tracing::info!("Found {} active jobs to register", results.len());
results.iter().for_each(|r| {
println!("Registering job: {:?}", r.task_name);
let job: _ = job_scheduler::Job::new(r.schedule.parse().unwrap(), || match r.task_id { for job in &results {
1 => import_posts::register(&Arc::new(&self.pool)), tracing::info!("Registering job: {}", job.task_name);
_ => panic!(),
});
scheduler.add(job); let schedule = job
}); .schedule
.parse()
.map_err(|e| format!("Failed to parse schedule '{}': {}", job.schedule, e))?;
let task: Box<dyn Fn() + Send + Sync> = match job.task_id {
1 => {
let pool = Arc::new(self.pool.clone());
Box::new(move || import_posts::register(&pool))
}
2 => {
let pool = Arc::new(self.pool.clone());
let cache = Arc::new(self.cache.clone());
let s3_client = Arc::new(self.s3_client.clone());
Box::new(move || upload_rss::register(&pool, &cache, &s3_client))
}
3 => {
let pool = Arc::new(self.pool.clone());
let cache = Arc::new(self.cache.clone());
let s3_client = Arc::new(self.s3_client.clone());
Box::new(move || upload_sitemap::register(&pool, &cache, &s3_client))
}
id => return Err(format!("Unknown task_id: {}", id).into()),
};
self.scheduler.add(job_scheduler::Job::new(schedule, task));
}
Ok(())
} }
} }
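The `schedule` column holds a cron expression with a leading seconds field, which is what `job_scheduler` parses. A sketch of validating the kind of rows the tasks table might contain (the expressions are hypothetical seed data, and this assumes `job_scheduler` re-exports cron's `Schedule`):

fn validate_schedules() {
    // Assumed seed rows: task 1 hourly; tasks 2 and 3 daily at 03:00
    let rows = [(1, "0 0 * * * *"), (2, "0 0 3 * * *"), (3, "0 0 3 * * *")];
    for (task_id, schedule) in rows {
        assert!(
            schedule.parse::<job_scheduler::Schedule>().is_ok(),
            "task {} has an invalid cron expression",
            task_id
        );
    }
}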

View File

@@ -1,81 +1,110 @@
use std::fs;
use std::io::Read;

use crate::utils::task_log;
use chrono::{DateTime, FixedOffset, Utc};
use serde::{Deserialize, Deserializer};

pub fn register(pool: &sqlx::Pool<sqlx::Postgres>) {
    let p = pool.clone();
    tokio::spawn(async move {
        let _ = import_posts("/app/posts", &p).await;
    });
}

async fn import_posts(
    dir_path: &str,
    pool: &sqlx::Pool<sqlx::Postgres>,
) -> Result<(), Box<dyn std::error::Error>> {
    println!("Beginning post import process");

    // Start task logging
    let task = task_log::start(1, pool).await?;

    let options = MarkdownOptions {
        options: markdown::Constructs::gfm(),
    };
    let entries = fs::read_dir(dir_path)?;

    for entry_result in entries {
        let file = entry_result?;
        let file_path = file.path();

        if !file_path.is_file() {
            continue;
        }

        let file_name = file.file_name();
        let file_name_str = match file_name.to_str() {
            Some(name) => name,
            None => {
                eprintln!("Skipping file with non-UTF8 filename: {:?}", file_path);
                continue;
            }
        };

        println!("Processing file: {}", file_name_str);

        // Check if file already exists in database
        let exists_query = sqlx::query_as!(
            FilenameExists,
            "SELECT EXISTS(SELECT 1 FROM posts p WHERE p.filename = $1) as filename",
            file_name_str
        )
        .fetch_one(pool)
        .await?;

        // Skip if file already exists in database
        if !exists_query.filename.unwrap_or(false) {
            println!("Importing new file: {}", file_name_str);

            // Process file contents
            let file_md_contents = process_read_file(&file_path)?;

            // Extract metadata
            let document = crate::utils::front_matter::YamlFrontMatter::parse::<MarkdownMetadata>(
                &file_md_contents,
            )?;

            let content =
                markdown::to_html_with_options(&document.content, &markdown::Options::default());
            let title = document.metadata.title;
            let pub_date =
                DateTime::parse_from_str(document.metadata.date.as_ref(), "%Y-%m-%d %H:%M:%S %z")?;
            let content_final = content.unwrap();

            // Insert into database
            let results = sqlx::query_as::<_, InsertPosts>(
                "INSERT INTO posts (title, body, filename, publish_date, author_id) VALUES ($1, $2, $3, $4, $5) RETURNING title, body, filename, author_id"
            )
            .bind(title)
            .bind(content_final)
            .bind(file_name_str)
            .bind(pub_date)
            .bind(1) // Consider making author_id a parameter
            .fetch_one(pool)
            .await?;

            println!("Successfully imported: {}", file_name_str);
        } else {
            println!("Skipping existing file: {}", file_name_str);
        }
    }

    // Mark task as completed
    task_log::update(task.task_id, String::from("Completed"), pool).await?;

    Ok(())
}

fn process_read_file(file_path: &std::path::Path) -> Result<String, std::io::Error> {
    let file = std::fs::read_to_string(file_path)?;
    Ok(file)
}

#[derive(Debug, sqlx::FromRow)]
struct FilenameExists {
    filename: Option<bool>,
}

#[derive(Debug, sqlx::FromRow)]
@@ -90,7 +119,7 @@ struct MarkdownOptions {
    options: markdown::Constructs,
}

#[derive(Deserialize, Debug)]
struct MarkdownMetadata {
    layout: String,
    title: String,
View File

@@ -1 +1,3 @@
pub mod import_posts;
pub mod upload_rss;
pub mod upload_sitemap;

View File

@@ -0,0 +1,68 @@
use crate::utils::{
request::{Request, Response},
task_log,
};
use cache::KeysInterface;
use storage::services::{aws::S3Client, ObjectStorageClient};
pub fn register(pool: &sqlx::Pool<sqlx::Postgres>, cache: &cache::Pool, s3_client: &S3Client) {
let p = pool.clone();
let c = cache.clone();
let s3 = s3_client.to_owned();
tokio::spawn(async move {
let _ = upload_rss(&p, &c, s3).await;
});
}
async fn upload_rss(
pool: &sqlx::Pool<sqlx::Postgres>,
cache: &cache::Pool,
s3_client: S3Client,
) -> Result<(), Box<dyn std::error::Error>> {
// start task logging
task_log::start(2, pool).await?;
// get request and request the things
let request = Request::new();
let rss_url = format!("{}/posts/rss", request.base_url);
let rss_result = request.request_url::<String>(&rss_url).await.unwrap();
// upload the sucker to obj storage
if let Response::Xml(rss) = rss_result {
let cached: &Option<String> = &cache.get(String::from("rss")).await.unwrap_or(None);
let cache_clone = cache.clone();
if let Some(cached_value) = cached {
if *cached_value == rss {
println!("Response is the same in the cache, exiting");
return Ok(());
}
}
let r = rss.clone();
let _ = s3_client
.put_object(
s3_client.client_config.bucket.as_str(),
"feed.xml",
rss.as_bytes().to_vec(),
)
.await?;
tokio::spawn(async move {
cache_clone
.set::<String, String, &String>(
String::from("rss"),
&r,
Some(cache::Expiration::EX(3600)),
None,
false,
)
.await
.unwrap();
});
println!("Finished uploading RSS feed");
}
Ok(())
}

View File

@@ -0,0 +1,67 @@
use crate::utils::{
request::{Request, Response},
task_log,
};
use cache::KeysInterface;
use storage::services::{aws::S3Client, ObjectStorageClient};
pub fn register(pool: &sqlx::Pool<sqlx::Postgres>, cache: &cache::Pool, s3_client: &S3Client) {
let p = pool.clone();
let c = cache.clone();
let s3 = s3_client.to_owned();
tokio::spawn(async move {
let _ = upload_sitemap(&p, &c, s3).await;
});
}
async fn upload_sitemap(
pool: &sqlx::Pool<sqlx::Postgres>,
cache: &cache::Pool,
s3_client: S3Client,
) -> Result<(), Box<dyn std::error::Error>> {
// start task logging
task_log::start(3, pool).await?;
// get request and request the things
let request = Request::new();
let sitemap_url = format!("{}/posts/sitemap", request.base_url);
let sitemap_result = request.request_url::<String>(&sitemap_url).await.unwrap();
// upload the sucker to obj storage
if let Response::Xml(sitemap) = sitemap_result {
let cached: &Option<String> = &cache.get(String::from("sitemap")).await.unwrap_or(None);
let cache_clone = cache.clone();
if let Some(cached_value) = cached {
if *cached_value == sitemap {
println!("Response is the same in the cache, exiting");
return Ok(());
}
}
let s = sitemap.clone();
let _ = s3_client
.put_object(
s3_client.client_config.bucket.as_str(),
"sitemap.xml",
sitemap.as_bytes().to_vec(),
)
.await?;
tokio::spawn(async move {
cache_clone
.set::<String, String, &String>(
String::from("sitemap"),
&s,
Some(cache::Expiration::EX(3600)),
None,
false,
)
.await
.unwrap();
});
println!("Finished uploading sitemap!");
}
Ok(())
}

View File

@@ -1,18 +1,34 @@
// derived from https://github.com/EstebanBorai/yaml-front-matter
use serde::de::DeserializeOwned;

#[derive(Debug)]
pub struct Document {
    pub metadata: FrontMatter,
    pub content: String,
}

#[derive(Debug)]
pub struct FrontMatter {
    pub layout: String,
    pub title: String,
    pub date: String,
    pub published: bool,
}

pub struct YamlFrontMatter;

impl YamlFrontMatter {
    pub fn parse<T: DeserializeOwned>(
        markdown: &str,
    ) -> Result<Document, Box<dyn std::error::Error>> {
        let yaml = YamlFrontMatter::extract(markdown)?;
        let clean_yaml = YamlFrontMatter::unescape_str(&yaml.0);
        let metadata = match YamlFrontMatter::from_yaml_str(clean_yaml.as_str()) {
            Ok(m) => m,
            Err(e) => {
                println!("{e}");
                panic!();
            }
        };

        Ok(Document {
            metadata,
@@ -52,4 +68,47 @@ impl YamlFrontMatter {
            .join("\n"),
        ))
    }
fn unescape_str(s: &str) -> String {
s.replace("\\n", "\n")
.replace("\\\"", "\"")
.replace("\\\\", "\\")
// .replace("\\t", "\t")
// .replace("\\r", "\r")
}
fn from_yaml_str(yaml: &str) -> Result<FrontMatter, String> {
let mut layout = String::new();
let mut title = String::new();
let mut date = String::new();
let mut published = false;
for line in yaml.lines() {
let line = line.trim();
if let Some((key, value)) = line.split_once(':') {
let key = key.trim();
let value = value.trim();
match key {
"layout" => layout = value.to_string(),
"title" => {
// Remove quotes if present
title = value.trim_matches('\'').trim_matches('"').to_string();
}
"date" => date = value.to_string(),
"published" => {
published = value.parse().map_err(|_| "Invalid boolean for published")?;
}
_ => {} // Ignore unknown fields
}
}
}
Ok(FrontMatter {
layout,
title,
date,
published,
})
}
}
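A quick sketch of the parser on a typical post file (front matter fenced by `---` lines, which is what `extract` appears to look for; note the type parameter is currently vestigial, since `parse` always returns the hand-rolled `FrontMatter` regardless of `T`):

fn demo() -> Result<(), Box<dyn std::error::Error>> {
    let markdown = concat!(
        "---\n",
        "layout: post\n",
        "title: 'Hello, world'\n",
        "date: 2024-01-01 00:00:00 -0500\n",
        "published: true\n",
        "---\n",
        "# Hello\n",
    );
    // IgnoredAny underlines that the type parameter is unused
    let doc = YamlFrontMatter::parse::<serde::de::IgnoredAny>(markdown)?;
    assert_eq!(doc.metadata.title, "Hello, world");
    assert!(doc.metadata.published);
    Ok(())
}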

View File

@@ -1,2 +1,4 @@
pub mod front_matter;
pub mod request;
pub mod task_log;
pub mod upload;

View File

@@ -0,0 +1,85 @@
use reqwest::StatusCode;
use std::env;
use std::time::Duration;
#[derive(Debug)]
pub struct Request<'a> {
pub client: reqwest::Client,
pub base_url: Box<str>,
pub full_url: Option<&'a str>,
}
#[derive(Debug)]
pub enum Response<T> {
Json(T),
Xml(String),
Text(String),
Bytes(Vec<u8>),
}
impl<'a> Request<'a> {
pub fn new() -> Self {
Request {
client: reqwest::ClientBuilder::new()
.use_rustls_tls()
.timeout(Duration::from_secs(30))
.build()
.unwrap(),
base_url: env::var("BASE_URI_API")
.expect("Environment variable BASE_URI_API is not found")
.into_boxed_str(),
full_url: None,
}
}
pub async fn request_url<T>(
&self,
url: &String,
) -> Result<Response<T>, Box<dyn std::error::Error>>
where
T: for<'de> serde::Deserialize<'de>,
{
println!("{}", url);
let api_result = match self.client.get(url).send().await {
Ok(r) => r,
Err(e) => return Err(Box::new(e)),
};
match api_result.status() {
StatusCode::OK => {
// TODO: handle errors here
let content_type = api_result
.headers()
.get("content-type")
.and_then(|v| v.to_str().ok())
.unwrap();
if content_type.contains("application/json") {
match api_result.json::<T>().await {
Ok(j) => Ok(Response::Json(j)),
Err(e) => return Err(Box::new(e)),
}
} else if content_type.contains("application/xml") {
match api_result.text().await {
Ok(x) => Ok(Response::Xml(x)),
Err(e) => return Err(Box::new(e)),
}
} else if content_type.starts_with("text/") {
match api_result.text().await {
Ok(t) => Ok(Response::Text(t)),
Err(e) => return Err(Box::new(e)),
}
} else {
match api_result.bytes().await {
Ok(b) => Ok(Response::Bytes(b.to_vec())),
Err(e) => Err(Box::new(e)),
}
}
}
status => Err(Box::new(std::io::Error::new(
std::io::ErrorKind::Other,
format!("Unexpected status code: {}", status),
))),
}
}
}
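Callers match on the `Response` variant they expect; a sketch of fetching the RSS XML the way the upload tasks do (panics if `BASE_URI_API` is unset, as `Request::new` does):

async fn fetch_rss_xml() -> Result<String, Box<dyn std::error::Error>> {
    let request = Request::new();
    let url = format!("{}/posts/rss", request.base_url);
    // The type parameter only matters for the Json variant
    match request.request_url::<String>(&url).await? {
        Response::Xml(xml) => Ok(xml),
        other => Err(format!("expected XML, got {:?}", other).into()),
    }
}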

View File

@@ -1,16 +0,0 @@
layout {
pane {
pane
pane split_direction="horizontal" {
pane
pane
}
}
}
keybinds {
unbind "Ctrl s"
}
theme "catppuccin-mocha"

137
docker-compose.yaml Normal file
View File

@@ -0,0 +1,137 @@
version: "3.8"
services:
valkey-mywebsite:
image: valkey/valkey:8.0.2
container_name: valkey-mywebsite
ports:
- "6379:6379"
volumes:
- valkey_data:/data
restart: unless-stopped
networks:
- app_network
healthcheck:
test: ["CMD", "valkey-cli", "ping"]
interval: 30s
timeout: 10s
retries: 3
postgres-mywebsite:
image: postgres:16
container_name: postgres-mywebsite
# fill these in with postgres env vars
environment:
POSTGRES_USER: wyatt
POSTGRES_PASSWORD: test # <<< replace this
ports:
- "5432:5432"
volumes:
- postgres_data:/var/lib/postgresql/data
- ./init-db:/docker-entrypoint-initdb.d
restart: unless-stopped
networks:
- app_network
healthcheck:
test: ["CMD-SHELL", "pg_isready -U wyatt -d postgres"]
interval: 30s
timeout: 10s
retries: 3
frontend:
image: scm.wyattjmiller.com/wymiller/my-website-v2_frontend:master
container_name: frontend
ports:
- "8000:8000"
# fill these in the frontend env vars for prod
environment:
- BASE_URI_API=
- BASE_URI_WEB=
- EMAIL_FORM=
- RSS_URI=
- SITEMAP_URI=
- VIRTUAL_HOST=wyattjmiller.com
- VIRTUAL_PORT=80
- LETSENCRYPT_HOST=wyattjmiller.com
- LETSENCRYPT_EMAIL=wjmiller2016@gmail.com
volumes:
- ./deno-fresh-app:/app
- /app/node_modules
    depends_on:
      postgres-mywebsite:
        condition: service_healthy
      valkey-mywebsite:
        condition: service_healthy
restart: unless-stopped
networks:
- app_network
healthcheck:
test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
interval: 30s
timeout: 10s
retries: 3
public-mywebsite:
image: scm.wyattjmiller.com/wymiller/my-website-v2_public:master
container_name: public-mywebsite
ports:
- "3000:3000"
# fill these in with public env vars for prod
environment:
- DATABASE_URL=
- REDIS_URL=
- BASE_URI_WEB=
    depends_on:
      postgres-mywebsite:
        condition: service_healthy
      valkey-mywebsite:
        condition: service_healthy
restart: unless-stopped
networks:
- app_network
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:3000/health"]
interval: 30s
timeout: 10s
retries: 3
task-mywebsite:
image: scm.wyattjmiller.com/wymiller/my-website-v2_task:master
container_name: task-mywebsite
# fill these in with task env vars for prod
environment:
- DATABASE_URL=
- BASE_URI_API=
- S3_ACCESS_KEY=
- S3_SECRET_KEY=
- S3_BUCKET=
- REDIS_URL=
    depends_on:
      postgres-mywebsite:
        condition: service_healthy
      valkey-mywebsite:
        condition: service_healthy
volumes:
- ./backend/task/app:/app/posts # <<< place all markdown files here
restart: unless-stopped
networks:
- app_network
healthcheck:
test: ["CMD", "pgrep", "-f", "task-mywebsite"]
interval: 30s
timeout: 10s
retries: 3
networks:
app_network:
driver: bridge
ipam:
config:
- subnet: 172.20.0.0/16
volumes:
  valkey_data:
    driver: local
  postgres_data:
    driver: local

22
flake.lock generated
View File

@@ -21,11 +21,11 @@
] ]
}, },
"locked": { "locked": {
"lastModified": 1729525729, "lastModified": 1748076591,
"narHash": "sha256-YiooFGeR7+sXSkHNfSzT8GQf+xtzbDwUbfbwkCCyuUs=", "narHash": "sha256-zfcYlOBYGfp4uxPC9ctaWf37bjZagbQ0pw7mqgTqfBI=",
"owner": "nekowinston", "owner": "nekowinston",
"repo": "nix-deno", "repo": "nix-deno",
"rev": "e92687492a4faec48ab1eb45adbdba30c876b0e5", "rev": "0b22de7dd34c7d7c7cd46cedee0b65592dc57d3e",
"type": "github" "type": "github"
}, },
"original": { "original": {
@@ -36,12 +36,12 @@
}, },
"nixpkgs": { "nixpkgs": {
"locked": { "locked": {
"lastModified": 1741862977, "lastModified": 1752436162,
"narHash": "sha256-prZ0M8vE/ghRGGZcflvxCu40ObKaB+ikn74/xQoNrGQ=", "narHash": "sha256-Kt1UIPi7kZqkSc5HVj6UY5YLHHEzPBkgpNUByuyxtlw=",
"rev": "cdd2ef009676ac92b715ff26630164bb88fec4e0", "rev": "dfcd5b901dbab46c9c6e80b265648481aafb01f8",
"revCount": 715614, "revCount": 806304,
"type": "tarball", "type": "tarball",
"url": "https://api.flakehub.com/f/pinned/NixOS/nixpkgs/0.2411.715614%2Brev-cdd2ef009676ac92b715ff26630164bb88fec4e0/019590d8-bf83-7849-9c87-9e373480fc07/source.tar.gz" "url": "https://api.flakehub.com/f/pinned/NixOS/nixpkgs/0.2505.806304%2Brev-dfcd5b901dbab46c9c6e80b265648481aafb01f8/01980f2c-e7f3-7efc-b369-7ebec7be6e59/source.tar.gz"
}, },
"original": { "original": {
"type": "tarball", "type": "tarball",
@@ -63,11 +63,11 @@
] ]
}, },
"locked": { "locked": {
"lastModified": 1742005800, "lastModified": 1752633862,
"narHash": "sha256-6wuOGWkyW6R4A6Th9NMi6WK2jjddvZt7V2+rLPk6L3o=", "narHash": "sha256-Bj7ozT1+5P7NmvDcuAXJvj56txcXuAhk3Vd9FdWFQzk=",
"owner": "oxalica", "owner": "oxalica",
"repo": "rust-overlay", "repo": "rust-overlay",
"rev": "028cd247a6375f83b94adc33d83676480fc9c294", "rev": "8668ca94858206ac3db0860a9dec471de0d995f8",
"type": "github" "type": "github"
}, },
"original": { "original": {

View File

@@ -61,19 +61,22 @@
wget wget
nixpkgs-fmt nixpkgs-fmt
openssl openssl
openssl.dev
patchelf patchelf
deno deno
sqlx-cli sqlx-cli
cargo-watch cargo-watch
cargo-chef cargo-chef
valkey valkey
pkg-config-unwrapped
]; ];
# Environment variables # Environment variables
env = { env = {
RUST_BACKTRACE = "1"; RUST_BACKTRACE = "1";
RUST_SRC_PATH = "${pkgs.rustToolchain}/lib/rustlib/src/rust/library"; RUST_SRC_PATH = "${pkgs.rustToolchain}/lib/rustlib/src/rust/library";
ZELLIJ_CONFIG_FILE = "config.kdl"; PKG_CONFIG_PATH = "${pkgs.openssl.dev}/lib/pkgconfig";
# ZELLIJ_CONFIG_FILE = "config.kdl";
# PATH = "$PATH:$HOME/.local/share/nvim/mason/bin/deno"; # PATH = "$PATH:$HOME/.local/share/nvim/mason/bin/deno";
}; };
}; };

17
frontend/Dockerfile Normal file
View File

@@ -0,0 +1,17 @@
FROM denoland/deno:alpine

RUN apk add bash

# USER deno

WORKDIR /app

# Copy sources before caching so main.ts exists when the module graph is built
COPY . .

RUN bash -c 'deno cache main.ts'

RUN bash -c 'deno task build'

EXPOSE 8000

CMD ["deno", "run", "-A", "main.ts"]

View File

@@ -2,54 +2,49 @@ import * as hi from "jsr:@preact-icons/hi2";
export default function Footer() {
  return (
    <footer class="bg-[#313244] text-[#cdd6f4] py-8 mt-auto">
      <div class="container mx-auto px-4">
        {/* 2x2 grid on mobile, horizontal row on desktop - all centered */}
        <div class="grid grid-cols-2 place-items-center md:flex md:flex-row items-center justify-center md:gap-8">
          <a
            class="text-[#cdd6f4] transition-all duration-300 ease-in-out hover:text-[#cba6f7] hover:drop-shadow-[0_0_20px_rgba(96,165,250,0.7)] hover:scale-110 cursor-pointer visited:text-[#bac2de]"
            href="/rss"
          >
            <div class="flex items-center gap-2">
              <hi.HiOutlineRss />
              RSS
            </div>
          </a>

          <a
            class="text-[#cdd6f4] transition-all duration-300 ease-in-out hover:text-[#cba6f7] hover:drop-shadow-[0_0_20px_rgba(96,165,250,0.7)] hover:scale-110 cursor-pointer visited:text-[#bac2de]"
            href="/sitemap"
          >
            <div class="flex items-center gap-2">
              <hi.HiOutlineMap />
              Sitemap
            </div>
          </a>

          <a
            class="text-[#cdd6f4] transition-all duration-300 ease-in-out hover:text-[#cba6f7] hover:drop-shadow-[0_0_20px_rgba(96,165,250,0.7)] hover:scale-110 cursor-pointer visited:text-[#bac2de]"
            href="/resume.pdf"
          >
            <div class="flex items-center gap-2">
              <hi.HiOutlineBriefcase />
              Resume
            </div>
          </a>

          <a
            class="text-[#cdd6f4] transition-all duration-300 ease-in-out hover:text-[#cba6f7] hover:drop-shadow-[0_0_20px_rgba(96,165,250,0.7)] hover:scale-110 cursor-pointer visited:text-[#bac2de]"
            href="mailto:wjmiller2016@gmail.com"
          >
            <div class="flex items-center gap-2">
              <hi.HiOutlineEnvelope />
              Email me
            </div>
          </a>
        </div>

        <div class="border-t border-gray-700 mt-8 pt-4 text-center">

View File

@@ -35,14 +35,14 @@ export default function Header() {
  return (
    <header>
      <nav>
        <div class="grid grid-cols-2 mt-4 place-items-center md:flex md:flex-row items-center justify-center md:gap-8">
          {headerLinks.map((l) => {
            const newTab = l.newTab ? "_blank" : "_self";
            return (
              <a
                href={l.linkTo}
                target={newTab}
                class="text-[#cdd6f4] text-md sm:text-lg font-medium transition-all duration-300 ease-in-out hover:text-[#cba6f7] hover:drop-shadow-[0_0_20px_rgba(96,165,250,0.7)] hover:scale-110 cursor-pointer"
              >
                <div class="flex items-center gap-2">
                  {l.icon} {l.name}

View File

@@ -0,0 +1,89 @@
import * as hi from "jsr:@preact-icons/hi2";
export function PaginationControl({
paginatedData,
currentUrl,
authorId,
}: {
paginatedData: PaginatedPosts;
currentUrl: URL;
authorId: number;
}) {
const buildUrl = (page: number, limit?: number) => {
const params = new URLSearchParams(currentUrl.searchParams);
params.set("page", page.toString());
if (limit) params.set("limit", limit.toString());
return `${currentUrl.pathname}?${params.toString()}`;
};
if (paginatedData.totalPages <= 1) return null;
return (
<div class="mt-8 space-y-4">
{/* Pagination info and controls */}
<div class="flex flex-col sm:flex-row justify-center items-center gap-4">
<div class="flex items-center gap-2">
{paginatedData.hasPrevPage && (
<a
href={buildUrl(paginatedData.currentPage - 1)}
class="px-4 py-2 bg-[#45475a] text-[#cdd6f4] shadow-sm rounded hover:bg-[#6A6B7A] transition-colors"
>
<div class="flex items-center gap-2">
<hi.HiChevronDoubleLeft />
Previous
</div>
</a>
)}
{/* Page numbers */}
<div class="flex gap-1">
{Array.from(
{ length: Math.min(paginatedData.totalPages, 7) },
(_, i) => {
let pageNum;
if (paginatedData.totalPages <= 7) {
pageNum = i + 1;
} else {
const start = Math.max(1, paginatedData.currentPage - 3);
const end = Math.min(paginatedData.totalPages, start + 6);
pageNum = start + i;
if (pageNum > end) return null;
}
const isCurrentPage = pageNum === paginatedData.currentPage;
return (
<a
key={pageNum}
href={buildUrl(pageNum)}
class={`px-3 py-1 rounded text-sm shadow-sm ${
isCurrentPage
? "bg-[#6A6B7A] text-[#cdd6f4]"
: "bg-[#45475a] text-[#cdd6f4] hover:bg-[#6A6B7A]"
}`}
>
{pageNum}
</a>
);
},
)}
</div>
{paginatedData.hasNextPage && (
<a
href={buildUrl(paginatedData.currentPage + 1)}
class="px-4 py-2 bg-[#45475a] text-[#cdd6f4] shadow-sm rounded hover:bg-[#6A6B7A] transition-colors"
>
<div class="flex items-center gap-2">
Next
<hi.HiChevronDoubleRight />
</div>
</a>
)}
</div>
{/* Quick jump to page */}
</div>
</div>
);
}

View File

@@ -2,7 +2,13 @@ import { Post } from "../types/index.ts";
export const PostBody = function PostBody({ post }: PostBodyOpts) {
  return (
    <div class="mx-auto max-w-4xl p-4 bg-[#313244]">
      <div
        class="p-6 bg-[#484659] shadow-md rounded-lg text-[#f5e0dc] post-content overflow-hidden break-words hyphens-auto max-w-full
          [&>*]:max-w-5xl [&>*]:overflow-wrap-anywhere"
        dangerouslySetInnerHTML={{ __html: post.body }}
      ></div>
    </div>
  );
};

View File

@@ -2,22 +2,31 @@ import { convertUtc } from "../lib/convertUtc.ts";
import { truncateString } from "../lib/truncate.ts";
import { Post } from "../types/index.ts";

export const PostCard = function PostCard({
  post,
  colorValue,
}: {
  post: Post;
  colorValue: string;
}) {
  return (
    <div
      class={`p-6 bg-[#484659] rounded-lg shadow-xl transition-all duration-300 ease-in-out border-b-4 hover:shadow-xl hover:scale-105`}
      style={{ borderBottomColor: colorValue }}
    >
      <a href={`${Deno.env.get("BASE_URI_WEB")}/posts/${post.post_id}`}>
        <h2 class="text-white text-lg font-bold mb-2">{post.title}</h2>
        <p class="text-white">
          Written by{" "}
          <a
            class="text-white transition-all duration-300 ease-in-out hover:text-[#cba6f7] hover:drop-shadow-[0_0_10px_rgba(96,165,250,0.7)] hover:scale-110 cursor-pointer"
            href={`${Deno.env.get("BASE_URI_WEB")}/authors/${post.author_id}`}
          >
            {post.first_name} {post.last_name}
          </a>{" "}
          at {convertUtc(post.publish_date)}
        </p>
        <p class="text-gray-400">{truncateString(post.body, 45)}</p>
      </a>
    </div>
  );

View File

@@ -3,15 +3,19 @@ import { Post } from "../types/index.ts";
interface PostOpts {
  posts: Post[];
  colorValue: string;
}

export const PostCarousel = function PostCarousel({
  posts,
  colorValue,
}: PostOpts) {
  return (
    <div className="flex w-full justify-start items-start bg-[#313244] p-8">
      <div className="max-w-7xl mx-auto">
        <div className="flex flex-wrap justify-center gap-3">
          {posts.map((post: Post) => (
            <PostCard key={post.post_id} post={post} colorValue={colorValue} />
          ))}
        </div>
      </div>

View File

@@ -1,23 +1,29 @@
import { Head } from "$fresh/runtime.ts";
import { Post } from "../types/index.ts";
import { convertUtc } from "../lib/convertUtc.ts";

export const PostHeader = function PostHeader({ post }: PostHeaderOpts) {
  return (
    <>
      <Head>
        <title>Wyatt J. Miller | {post.title}</title>
      </Head>
      <div class="p-4 bg-[#313244]">
        <div class="min-w-screen flex flex-col items-center justify-between bg-[#484659] rounded-lg shadow-md">
          <div class="sm:mt-14 sm:mb-14 mt-8 mb-8 flex flex-col items-center gap-y-5 gap-x-10 md:flex-row">
            <div class="space-y-2 text-center md:text-left">
              <p class="text-2xl text-[#f5e0dc] font-bold sm:text-4xl">
                {post.title}
              </p>
              <p class="text-md font-medium text-[#E39A9C] sm:text-xl italic">
                by {post.first_name} {post.last_name} posted on{" "}
                {convertUtc(post.publish_date)}
              </p>
            </div>
          </div>
        </div>
      </div>
    </>
  );
};

View File

@@ -0,0 +1,13 @@
import { useState } from "preact/hooks";

export const ShareLinkButton = function ShareLinkButton({ props }) {
  const [text, setText] = useState("Share");

  const onClickHandler = () => {
    navigator.clipboard.writeText(location.href);
    setText("Copied to clipboard!");
    setTimeout(() => {
      setText("Share");
    }, 1000);
  };

  return <button onClick={onClickHandler}>{text}</button>;
};

View File

@@ -11,15 +11,10 @@
}, },
"lint": { "lint": {
"rules": { "rules": {
"tags": [ "tags": ["fresh", "recommended"]
"fresh",
"recommended"
]
} }
}, },
"exclude": [ "exclude": ["**/_fresh/*"],
"**/_fresh/*"
],
"imports": { "imports": {
"$fresh/": "https://deno.land/x/fresh@1.6.8/", "$fresh/": "https://deno.land/x/fresh@1.6.8/",
"$std/": "https://deno.land/std@0.216.0/", "$std/": "https://deno.land/std@0.216.0/",
@@ -33,7 +28,8 @@
"preact/jsx-runtime": "npm:preact@10.22.1/jsx-runtime", "preact/jsx-runtime": "npm:preact@10.22.1/jsx-runtime",
"tailwindcss": "npm:tailwindcss@3.4.1", "tailwindcss": "npm:tailwindcss@3.4.1",
"tailwindcss/": "npm:/tailwindcss@3.4.1/", "tailwindcss/": "npm:/tailwindcss@3.4.1/",
"tailwindcss/plugin": "npm:/tailwindcss@3.4.1/plugin.js" "tailwindcss/plugin": "npm:/tailwindcss@3.4.1/plugin.js",
"tailwind-highlightjs": "npm:tailwind-highlightjs"
}, },
"compilerOptions": { "compilerOptions": {
"jsx": "react-jsx", "jsx": "react-jsx",

View File

@@ -12,8 +12,12 @@ import * as $index from "./routes/index.tsx";
import * as $posts_id_ from "./routes/posts/[id].tsx";
import * as $posts_index from "./routes/posts/index.tsx";
import * as $projects_index from "./routes/projects/index.tsx";
import * as $rss_index from "./routes/rss/index.tsx";
import * as $sitemap_index from "./routes/sitemap/index.tsx";
import * as $Counter from "./islands/Counter.tsx";
import * as $ProjectCard from "./islands/ProjectCard.tsx";
import * as $modal from "./islands/modal.tsx";
import * as $portal from "./islands/portal.tsx";
import { type Manifest } from "$fresh/server.ts";

const manifest = {
@@ -28,10 +32,14 @@ const manifest = {
    "./routes/posts/[id].tsx": $posts_id_,
    "./routes/posts/index.tsx": $posts_index,
    "./routes/projects/index.tsx": $projects_index,
    "./routes/rss/index.tsx": $rss_index,
    "./routes/sitemap/index.tsx": $sitemap_index,
  },
  islands: {
    "./islands/Counter.tsx": $Counter,
    "./islands/ProjectCard.tsx": $ProjectCard,
    "./islands/modal.tsx": $modal,
    "./islands/portal.tsx": $portal,
  },
  baseUrl: import.meta.url,
} satisfies Manifest;
} satisfies Manifest; } satisfies Manifest;

View File

@@ -1,28 +1,45 @@
import { useState } from "preact/hooks";
import { Portal } from "./portal.tsx";
import { Modal } from "./modal.tsx";

export const ProjectCard = function ProjectCard(props: ProjectProps) {
  const [open, setOpen] = useState(false);

  return (
    <div
      class={`md:m-8 group space-y-1 rounded-md ${
        props.wip ? "border-2" : "cursor-pointer"
      } bg-[#44485b] px-3 py-2 m-4 shadow-md transition-all duration-300 ease-in-out border-b-4 border-b-[#94e2d5] hover:shadow-xl hover:scale-105`}
      style={
        props.wip
          ? {
              borderTopStyle: "dashed",
              borderRightStyle: "dashed",
              borderLeftStyle: "dashed",
            }
          : {}
      }
      onClick={() => {
        // clicking the card (not the link) opens the modal
        console.log("opened portal");
        setOpen(true);
      }}
    >
      <div class="flex items-center justify-between">
        <h2 class="text-lg text-white font-black uppercase">
          <a
            href={props.repo}
            target="_blank"
            onClick={(e) => {
              // clicking the link should not open the modal
              e.stopPropagation();
            }}
          >
            {props.title}
          </a>
        </h2>
        <div class="bg-[#585b70] text-[#a6adc8] text-xs font-bold uppercase px-2.5 py-0.5 rounded-full">
          {props.repo && <span>Active</span>}
          {!props.repo && !props.wip && <span>Dead</span>}
          {props.wip && <span>WIP</span>}
        </div>
@@ -33,6 +50,38 @@ export const ProjectCard = function ProjectCard(props: ProjectProps) {
      <p class="whitespace-pre-wrap text-sm font-semibold text-[#a6adc8]">
        {props.tech}
      </p>
{open && !props.wip ? (
<Portal into="body">
<Modal
title={props.title}
onClose={() => setOpen(false)}
actions={[
{
label: "Open repository",
onClick: () => {
if (props.repo) window.open(props.repo, "_blank");
},
variant: "primary",
},
{
label: "Close",
onClick: () => setOpen(false),
variant: "secondary",
},
]}
>
<div class="space-y-3">
<p class="text-sm text-gray-800 dark:text-gray-200">
{props.summary}
</p>
<p class="text-xs font-mono text-gray-600 dark:text-gray-300">
Technologies used: {props.tech}
</p>
</div>
</Modal>
</Portal>
) : null}
    </div>
  );
};

158
frontend/islands/modal.tsx Normal file
View File

@@ -0,0 +1,158 @@
import { useEffect, useRef, useState } from "preact/hooks";
import type { ComponentChildren } from "preact";
type ModalAction = {
label: string;
onClick: () => void;
variant?: "primary" | "secondary" | "link";
};
type ModalProps = {
title?: string;
/**
* Called after the modal has finished its exit animation.
* The Modal will run the exit animation internally and then call onClose().
*/
onClose: () => void;
children: ComponentChildren;
ariaLabel?: string;
actions?: ModalAction[]; // rendered in footer; each button will be given flex-1 so buttons fill the width together
/**
* Optional: duration (ms) of enter/exit animation. Defaults to 200.
* Keep this in sync with the CSS transition-duration classes used below.
*/
animationDurationMs?: number;
};
export function Modal({
title,
onClose,
children,
ariaLabel,
actions,
animationDurationMs = 200,
}: ModalProps) {
// Controls the entrance/exit animation state. true => visible (enter), false => hidden (exit)
const [isVisible, setIsVisible] = useState(false);
// Prevent double-triggering the close flow
const closingRef = useRef(false);
// Hold the timeout id for cleanup
const timeoutRef = useRef<number | null>(null);
useEffect(() => {
// Defer to next frame so initial "hidden" styles are applied before animating to visible.
const raf = requestAnimationFrame(() => setIsVisible(true));
function onKey(e: KeyboardEvent) {
if (e.key === "Escape") {
startCloseFlow();
}
}
document.addEventListener("keydown", onKey);
return () => {
cancelAnimationFrame(raf);
document.removeEventListener("keydown", onKey);
if (timeoutRef.current !== null) {
clearTimeout(timeoutRef.current);
}
};
// eslint-disable-next-line react-hooks/exhaustive-deps
}, []);
// If no actions provided, render a single Close button
const footerActions: ModalAction[] =
actions && actions.length > 0
? actions
: [{ label: "Close", onClick: startCloseFlow, variant: "primary" }];
// Start exit animation and call onClose after animationDurationMs
function startCloseFlow() {
if (closingRef.current) return;
closingRef.current = true;
setIsVisible(false);
// Wait for the CSS transition to finish before signalling parent to actually unmount
timeoutRef.current = window.setTimeout(() => {
timeoutRef.current = null;
onClose();
}, animationDurationMs);
}
// Animation classes (enter & exit):
// - panel: transitions opacity + transform for a subtle fade + pop
// - backdrop: transitions opacity for fade
const panelBase =
"relative z-10 max-w-lg w-full bg-white dark:bg-[#1f2937] rounded-lg shadow-xl p-6 mx-4 transform transition-all";
// We explicitly set the CSS transition duration inline to keep class + timeout in sync.
const panelVisible = "opacity-100 translate-y-0 scale-100";
const panelHidden = "opacity-0 translate-y-2 scale-95";
const backdropBase =
"absolute inset-0 bg-black/50 backdrop-blur-sm transition-opacity";
const backdropVisible = "opacity-100";
const backdropHidden = "opacity-0";
// Footer button class generator
const renderActionButton = (act: ModalAction) => {
const base =
"flex-1 w-full px-4 py-2 rounded-md font-semibold focus:outline-none";
const styles =
act.variant === "primary"
? "bg-[#94e2d5] text-black hover:brightness-95"
: act.variant === "link"
? "bg-transparent text-[#075985] underline"
: "bg-gray-200 dark:bg-gray-700 text-gray-800 dark:text-gray-200 hover:brightness-95";
return (
<button key={act.label} onClick={act.onClick} class={`${base} ${styles}`}>
{act.label}
</button>
);
};
return (
<div
class="fixed inset-0 z-50 flex items-center justify-center"
aria-modal="true"
role="dialog"
aria-label={ariaLabel ?? title ?? "Modal dialog"}
>
{/* Backdrop */}
<div
// inline style for transitionDuration to keep JS timeout and CSS synced
style={{ transitionDuration: `${animationDurationMs}ms` }}
class={`${backdropBase} ${isVisible ? backdropVisible : backdropHidden}`}
onClick={startCloseFlow}
/>
{/* Modal panel */}
<div
style={{ transitionDuration: `${animationDurationMs}ms` }}
class={`${panelBase} ${isVisible ? panelVisible : panelHidden}`}
onClick={(e) => e.stopPropagation()}
>
<div class="flex items-start justify-between">
<h3 class="text-lg font-semibold text-gray-900 dark:text-white">
{title}
</h3>
<button
onClick={startCloseFlow}
aria-label="Close modal"
class="ml-4 rounded-md text-gray-700 dark:text-gray-300 hover:bg-gray-100 dark:hover:bg-gray-700 p-1"
>
{/* assumed close glyph; the button body was empty */}
×
</button>
</div>
<div class="mt-4 text-sm text-gray-700 dark:text-gray-300">
{children}
</div>
<div class="mt-6">
<div class="flex gap-3 w-full">
{footerActions.map(renderActionButton)}
</div>
</div>
</div>
</div>
);
}
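
A minimal usage sketch for the component above (the parent component and its state are hypothetical; it assumes useState from preact/hooks):

import { useState } from "preact/hooks";
import { Modal } from "./modal.tsx";

// Hypothetical parent: it keeps the mount flag, and Modal calls onClose
// only after its exit animation has finished, so unmounting here is safe.
function Example() {
  const [show, setShow] = useState(false);
  return (
    <div>
      <button onClick={() => setShow(true)}>Open</button>
      {show && (
        <Modal title="Hello" onClose={() => setShow(false)}>
          <p>Body content</p>
        </Modal>
      )}
    </div>
  );
}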


@@ -0,0 +1,38 @@
import { useEffect, useState } from "preact/hooks";
import { createPortal } from "preact/compat";
import type { ComponentChildren } from "preact";
type PortalProps = {
into?: string | HTMLElement;
children: ComponentChildren;
};
export function Portal({ into = "body", children }: PortalProps) {
const [host, setHost] = useState<HTMLElement | null>(null);
useEffect(() => {
if (typeof document === "undefined") return;
let target: HTMLElement | null = null;
if (typeof into === "string") {
target = into === "body" ? document.body : document.querySelector(into);
} else {
target = into;
}
if (!target) target = document.body;
const wrapper = document.createElement("div");
wrapper.className = "preact-portal-root";
target.appendChild(wrapper);
setHost(wrapper);
return () => {
if (wrapper.parentNode) wrapper.parentNode.removeChild(wrapper);
setHost(null);
};
}, [into]);
if (!host) return null;
return createPortal(children, host);
}
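
Each mount creates its own wrapper div and removes it again on unmount, so multiple portals can coexist without clobbering each other. A usage sketch (the #overlay-root selector is hypothetical; the component falls back to document.body when the selector matches nothing):

// Renders into <div id="overlay-root"> if it exists, otherwise document.body.
<Portal into="#overlay-root">
  <div class="toast">Saved!</div>
</Portal>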


@@ -1,3 +1,4 @@
export const truncateString = (str: string, maxLength: number) => {
  // strip any HTML tags before measuring length
  str = str.replace(/<[^>]*>/g, "");
  return str.length > maxLength ? `${str.slice(0, maxLength)}...` : str;
};
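
A quick illustration of the new behavior, assuming the function is imported from this module: tags are stripped before the length check, so markup no longer counts toward maxLength.

truncateString("<p>Hello, <b>world</b>!</p>", 8); // => "Hello, w..."
truncateString("<em>short</em>", 10); // => "short"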


@@ -13,7 +13,7 @@ export default function Error404() {
The page you were looking for doesn't exist!
</p>
<a
href={`${Deno.env.get("BASE_URI_WEB")}/`}
class="text-[#cdd6f4] transition-all duration-300 ease-in-out hover:text-[#cba6f7] hover:drop-shadow-[0_0_20px_rgba(96,165,250,0.7)] hover:scale-110 cursor-pointer"
>
Go back home


@@ -8,7 +8,7 @@ export default function App({ Component }: PageProps) {
<title>frontend</title>
<link rel="stylesheet" href="/styles.css" />
</head>
<body class="bg-[#313244]">
<Component />
</body>
</html>


@@ -2,13 +2,20 @@ import { FreshContext, Handlers, PageProps } from "$fresh/server.ts";
import AuthorCard from "../../components/AuthorCard.tsx";
import { Post } from "../../types/index.ts";
import { PostCarousel } from "../../components/PostCarousel.tsx";
import { PaginationControl } from "../../components/PaginationControl.tsx";
export const handler: Handlers<PageData> = {
async GET(req: Request, ctx: FreshContext) {
try {
const url = new URL(req.url);
const page = parseInt(url.searchParams.get("page") || "1");
const limit = parseInt(url.searchParams.get("limit") || "12");
const [authorResponse, authorPostResponse] = await Promise.all([
fetch(`${Deno.env.get("BASE_URI_API")}/authors/${ctx.params.id}`),
fetch(
`${Deno.env.get("BASE_URI_API")}/authors/${ctx.params.id}/posts?page=${page}&limit=${limit}`,
),
]);
const [authorData, authorPostData] = await Promise.all([
@@ -16,9 +23,37 @@ export const handler: Handlers<PageData> = {
authorPostResponse.json(),
]);
let paginatedData: PaginatedPosts;
if (authorPostData.posts && authorPostData.total_posts !== undefined) {
const totalPages = Math.ceil(authorPostData.total_posts / limit);
paginatedData = {
posts: authorPostData.posts,
currentPage: page,
totalPages,
hasNextPage: page < totalPages,
hasPrevPage: page > 1,
totalPosts: authorPostData.total_posts,
};
} else {
const allPosts = Array.isArray(authorPostData) ? authorPostData : [];
const totalPages = Math.ceil(allPosts.length / limit);
const startIndex = (page - 1) * limit;
const endIndex = startIndex + limit;
paginatedData = {
posts: allPosts.slice(startIndex, endIndex),
currentPage: page,
totalPages,
hasNextPage: page < totalPages,
hasPrevPage: page > 1,
totalPosts: allPosts.length,
};
}
return ctx.render({
authorData,
authorPostData: paginatedData,
});
} catch (error) {
return ctx.render({
@@ -30,7 +65,7 @@ export const handler: Handlers<PageData> = {
},
};
export default function AuthorIdentifier({ data, url }: PageProps<PageData>) {
const { authorData, authorPostData, error } = data;
if (error) {
@@ -52,7 +87,12 @@ export default function AuthorIdentifier({ data }: PageProps<PageData>) {
<AuthorCard author={authorData} isIdentified={true} />
</div>
<div>
<PostCarousel posts={authorPostData.posts} />
<PaginationControl
paginatedData={authorPostData}
currentUrl={url}
authorId={authorData.author_id}
/>
</div>
</>
);
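
PaginatedPosts itself is declared elsewhere; from the fields assembled in the handler above it presumably has this shape (a sketch, not the repo's declaration):

// Inferred from the handler above. Example: 25 posts with limit 12 gives
// totalPages = ceil(25 / 12) = 3; page 2 covers indices 12..23.
type PaginatedPosts = {
  posts: Post[];
  currentPage: number;
  totalPages: number;
  hasNextPage: boolean;
  hasPrevPage: boolean;
  totalPosts: number;
};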


@@ -26,7 +26,6 @@ export const handler: Handlers = {
message: formData.get("message")?.toString(),
};
// Validation logic
const errors: FormState["errors"] = {};
if (!state.name || state.name.trim() === "") {
@@ -44,7 +43,6 @@ export const handler: Handlers = {
errors.message = "Message is required";
}
// If there are errors, return the form with error messages
if (Object.keys(errors).length > 0) {
return ctx.render({
...state,
@@ -56,7 +54,6 @@ export const handler: Handlers = {
method: "POST",
body: formData,
});
console.log(res);
if (!res.ok || res.status !== 200) {
return ctx.render({
@@ -77,11 +74,15 @@ export default function Contact({ data }: PageProps<FormState>) {
<div class="bg-[#313244] min-h-screen">
<div class="px-4 py-8 mx-auto p-6 flex flex-col bg-[#313244] min-h-screen w-full md:max-w-md">
<Head>
<title>Wyatt J. Miller | Contact</title>
</Head>
<h1 class="text-3xl text-white font-bold uppercase text-center">
Contact
</h1>
<p class="md:text-lg sm:text-md text-white mt-5 mb-5">
Got a question? Here to yell at me? Send me something!
</p>
<br />
{data?.submitted && (
<div
class="bg-[#a6e3a1] text-[#313244] px-4 py-3 rounded relative"
@@ -114,7 +115,7 @@ export default function Contact({ data }: PageProps<FormState>) {
required
placeholder="Your Name"
value={data?.name || ""}
class={`w-full px-3 py-2 bg-[#ECECEE] border rounded-md focus:outline-transparent
${data?.errors?.name ? "border-[#f38ba8]" : "border-[#313244]"}`}
/>
{data?.errors?.name && (
@@ -137,7 +138,7 @@ export default function Contact({ data }: PageProps<FormState>) {
required
placeholder="your@email.com"
value={data?.email || ""}
class={`w-full px-3 py-2 bg-[#ECECEE] border rounded-md focus:outline-transparent
${data?.errors?.email ? "border-[#f38ba8]" : "border-[#313244]"}`}
/>
{data?.errors?.email && (
@@ -159,7 +160,7 @@ export default function Contact({ data }: PageProps<FormState>) {
required
placeholder="Write your message here..."
rows={4}
class={`w-full px-3 py-2 bg-[#ECECEE] border rounded-md focus:outline-transparent
${data?.errors?.message ? "border-red-500" : "border-gray-300"}`}
>
{data?.message || ""}
@@ -174,7 +175,7 @@ export default function Contact({ data }: PageProps<FormState>) {
<div>
<button
type="submit"
class="w-full bg-[#44475b] text-[#cdd6f4] py-2 px-4 rounded-md focus:outline-none focus:ring-2 focus:ring-blue-500 shadow-md"
>
Send Message
</button>


@@ -2,8 +2,8 @@ import { PhotoCircle } from "../components/PhotoCircle.tsx";
export default function Home() {
return (
<body class="bg-[#313244]">
<div class="flex flex-col items-center justify-between min-h-screen">
<div class="sm:mt-14 sm:mb-14 mt-12 mb-4 flex flex-col items-center gap-y-5 gap-x-10 md:flex-row">
<PhotoCircle
src="https://wyattjmiller.us-ord-1.linodeobjects.com/IMG_1480-min.png"


@@ -27,7 +27,6 @@ export const handler: Handlers<PageData> = {
export default function PostIdentifier({ data }: PageProps<PageData>) {
const { postData } = data;
console.log(postData);
return (
<div>


@@ -52,10 +52,11 @@ export default function PostPage({ data }: PageProps<PageData>) {
Featured Posts
</h2>
</div>
<div className="text-lg font-thin italic text-white mb-4 text-center flex underline decoration-[#89b4fa] decoration-2">
Ignite the impossible
</div>
<PostCarousel posts={featuredPosts} colorValue="#89b4fa" />
</section>
<section>
<div class="flex items-center gap-2 text-2xl text-white md:justify-start">
@@ -64,22 +65,10 @@ export default function PostPage({ data }: PageProps<PageData>) {
Recent Posts
</h2>
</div>
<div className="text-lg font-thin italic mb-4 text-white text-center flex underline decoration-[#89dceb] decoration-2">
Now with 100% fresh perspective
</div>
<PostCarousel posts={recentPosts} colorValue="#89dceb" />
</section>
<section>
<div class="flex items-center gap-2 text-2xl text-white md:justify-start">
<hi.HiOutlineFire />
<h2 class="text-2xl font-bold text-white text-center lg:text-left">
Hot Posts
</h2>
</div>
<div className="text-lg font-thin italic mb-4 text-white text-center flex">
Making chaos look cool since forever
</div>
<PostCarousel posts={hotPosts} />
</section>
<section>
<div class="flex items-center gap-2 text-2xl text-white md:justify-start">
@@ -88,10 +77,10 @@ export default function PostPage({ data }: PageProps<PageData>) {
Popular Posts
</h2>
</div>
<div className="text-lg font-thin italic mb-4 text-white text-center flex underline decoration-[#b4befe] decoration-2">
Content may cause uncontrollable reading
</div>
<PostCarousel posts={popularPosts} colorValue="#b4befe" />
</section>
</div>
);


@@ -1,65 +1,59 @@
import { FreshContext, Handlers, PageProps } from "$fresh/server.ts";
import { ProjectCard } from "../../islands/ProjectCard.tsx";
interface ProjectData {
project_id: number;
title: string;
repo?: string;
summary: string;
tech: string;
wip?: boolean;
created_at: string;
}
export const handler: Handlers<ProjectData> = {
async GET(_req: Request, ctx: FreshContext) {
const projectResult = await fetch(
`${Deno.env.get("BASE_URI_API")}/projects`,
);
const projectData = await projectResult.json();
return ctx.render({
projectData,
});
},
};
export default function Projects({ data }: PageProps<ProjectData>) {
const { projectData: projects } = data;
return (
<div class="space-y-12 px-10 py-8 sm:min-h-screen bg-[#313244]">
<section
id="projects"
class="lg:grid-cols-desktop grid scroll-mt-8 grid-cols-1 gap-x-4 gap-y-2 bg-[#313244] "
>
<h1 class="text-3xl text-white font-bold uppercase text-center">
Projects
</h1>
<p class="md:text-lg sm:text-md text-white">
Here's a collection of software and electronics projects I've been
tinkering with during my free time - some are ongoing adventures,
others are finished experiments, but they've all been exciting
challenges that keep me busy when I'm not doing "real work" stuff!
</p>
<div class="grid grid-cols-1 sm:grid-cols-2">
{projects.map((project: any) => {
return (
<ProjectCard
title={project.title}
repo={project.repo ?? undefined}
summary={project.summary}
tech={project.tech}
wip={project.wip ?? true}
/>
);
})}
</div>
</section>
</div>
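
The handler assumes the /projects endpoint returns a JSON array of objects matching ProjectData, roughly like this (illustrative values only):

// Illustrative payload - field names follow the ProjectData interface above.
const sample: ProjectData[] = [
  {
    project_id: 1,
    title: "Example project",
    repo: "https://scm.example.com/user/example",
    summary: "Short description of the project",
    tech: "Rust, Docker",
    wip: false,
    created_at: "2025-01-01T00:00:00Z",
  },
];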


@@ -0,0 +1,3 @@
export function handler(req: Request): Response {
return Response.redirect(`${Deno.env.get("RSS_URI")}`, 307);
}


@@ -0,0 +1,3 @@
export function handler(req: Request): Response {
return Response.redirect(`${Deno.env.get("SITEMAP_URI")}`, 307);
}
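
Both redirect handlers behave the same way: a 307 keeps the redirect temporary, so feed readers and crawlers continue to request the original path. A quick check of the sitemap handler (environment value and URL are placeholders):

// Assuming SITEMAP_URI is set, the handler answers with a 307 and a Location header.
Deno.env.set("SITEMAP_URI", "https://example.com/sitemap.xml");
const res = handler(new Request("https://example.com/sitemap"));
console.assert(res.status === 307);
console.assert(res.headers.get("location") === "https://example.com/sitemap.xml");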

Binary image file changed (not shown): 22 KiB before, 4.2 KiB after.


@@ -0,0 +1 @@
Sitemap: https://wyattjmiller.us-ord-1.linodeobjects.com/feed.xml

Some files were not shown because too many files have changed in this diff.