Compare commits

...

19 commits

Author SHA1 Message Date
0aef4e7260
fix(ci): correctly use magic-nix-cache
2024-10-31 23:06:53 -07:00
06519c40be
fix(ci): remove cachix install nix action 2024-10-31 23:05:59 -07:00
92d10b707d
fix(ci): idk bro 2024-10-31 23:03:46 -07:00
c36a000684
fix(ci): remove cachix push and add magic nix cache 2024-10-31 23:01:28 -07:00
b8ade68874
fix: remove node_modules symlink during buildPhase 2024-10-31 22:55:52 -07:00
87630118d8
docs: update readme with new instructions 2024-10-31 22:51:28 -07:00
97e20c9db4
feat: build site entirely in nix 2024-10-31 22:41:33 -07:00
6585515da7
content: add draft of nix post 2024-10-31 21:39:57 -07:00
db02a9127f
style: italicize header 2024-10-31 20:02:59 -07:00
66a6a1cdf9
style: don't expand gradient ascent header 2024-10-31 19:54:19 -07:00
c803aea65c
style: update cursor for <summary> 2024-10-31 19:52:54 -07:00
a8b1c8c040
style: add paragraph styling to <detail> 2024-10-31 19:47:40 -07:00
36b5610592
update 2024-10-31 19:37:30 -07:00
46a513458e
chore: deleted unused pictures 2024-10-31 17:53:09 -07:00
f377b5ca11
chore: run nixfmt 2024-10-31 17:52:31 -07:00
Youwen Wu
aea6f85d0e
Merge pull request #3 from youwen5/dependabot/github_actions/cachix/install-nix-action-30
2024-10-03 20:00:46 -07:00
dependabot[bot]
894f0e80f7
chore(deps): bump cachix/install-nix-action from 29 to 30
Bumps [cachix/install-nix-action](https://github.com/cachix/install-nix-action) from 29 to 30.
- [Release notes](https://github.com/cachix/install-nix-action/releases)
- [Commits](https://github.com/cachix/install-nix-action/compare/v29...v30)

---
updated-dependencies:
- dependency-name: cachix/install-nix-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-10-04 00:38:59 +00:00
Youwen Wu
9588565b7a
Merge pull request #2 from youwen5/dependabot/github_actions/cachix/install-nix-action-v29
2024-09-28 19:28:30 -07:00
dependabot[bot]
54e638cf8a
chore(deps): bump cachix/install-nix-action from V27 to 29
Bumps [cachix/install-nix-action](https://github.com/cachix/install-nix-action) from V27 to 29. This release includes the previously tagged commit.
- [Release notes](https://github.com/cachix/install-nix-action/releases)
- [Commits](https://github.com/cachix/install-nix-action/compare/V27...v29)

---
updated-dependencies:
- dependency-name: cachix/install-nix-action
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-09-27 00:39:12 +00:00
20 changed files with 3548 additions and 2308 deletions


@@ -9,24 +9,20 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: DeterminateSystems/nix-installer-action@main
- uses: DeterminateSystems/magic-nix-cache-action@main
- name: Install Nix
uses: cachix/install-nix-action@V27
with:
github_access_token: ${{ secrets.GITHUB_TOKEN }}
extra_nix_config: |
access-tokens = github.com=${{ secrets.GITHUB_TOKEN }}
allow-import-from-derivation = true
auto-optimise-store = true
experimental-features = nix-command flakes
substituters = https://cache.nixos.org https://cache.iog.io
trusted-public-keys = cache.nixos.org-1:6NCHdD59X431o0gWypbMrAURkbJ16ZPMQFGspcDShjY= hydra.iohk.io:f/Ea+s+dFdN+3Y/G+FDgSq+a5NEWhJGzdjvKNGv0/EQ=
- name: Build with cachix
uses: cachix/cachix-action@v15
with:
name: hakyll-nix-template
authToken: ${{ secrets.CACHIX_AUTH_TOKEN }}
# - name: Install Nix
# uses: cachix/install-nix-action@v30
# with:
# github_access_token: ${{ secrets.GITHUB_TOKEN }}
# extra_nix_config: |
# access-tokens = github.com=${{ secrets.GITHUB_TOKEN }}
# allow-import-from-derivation = true
# auto-optimise-store = true
# experimental-features = nix-command flakes
# substituters = https://cache.nixos.org https://cache.iog.io
# trusted-public-keys = cache.nixos.org-1:6NCHdD59X431o0gWypbMrAURkbJ16ZPMQFGspcDShjY= hydra.iohk.io:f/Ea+s+dFdN+3Y/G+FDgSq+a5NEWhJGzdjvKNGv0/EQ=
- run: nix build --accept-flake-config

.gitignore (1 line changed)

@@ -5,6 +5,5 @@ _cache
_tmp
dist
dist-newstyle
node_modules
result
.direnv

README.md

@@ -7,30 +7,77 @@ powered by [hakyll](https://jaspervdj.be/hakyll/) and
This repo is merely the source code, the actual site is hosted at
[blog.youwen.dev](https://blog.youwen.dev).
To build locally, install `nix` and enable flakes.
To build locally, install `nix` and enable flakes. Additionally, install the
`direnv` tool so that the provided binary utilities can be hooked into your
shell. It is also possible to perform the following steps without `direnv` if
you know what you are doing.
Allow the `.envrc`:
```sh
nix build
nix run . watch
direnv allow
```
This starts a hot reload server at `localhost:8000`.
Wait for the build to finish. Now, you will have the `rollup` and `hakyll-site`
binaries in your PATH.
We need to compile the site source code first, and then inject the bundled CSS
and JS using `rollup`. This is done automatically by `nix build`, which is used
for GitHub Pages deployment, but it is inconvenient for local development.
Here's how to do it locally.
First, we need to build the site. Run
```sh
nix run . build
hakyll-site build
# sometimes, we need to ignore the cache if things aren't working
hakyll-site rebuild
# can also use `watch` for convenient development
hakyll-site watch
# starts dev server at localhost:8000
```
This builds a local production version.
This will create `./dist`, containing the static assets. However, the required
CSS and JS are not in there yet! Those are built by `rollup`, since we are
using `tailwindcss`, `postcss`, and some JS minification tools.
If any updates are made to the JavaScript or CSS, you will need to run
First, we need the `node_modules`. We don't provide a `package-lock.json`,
since we don't use `npm` to manage node modules. Therefore, we obtain the
`node_modules` used by the project through Nix.
In the directory, there is a `node_modules` symlink to
`result/lib/node_modules`. If we build the `nodeDeps` package, the
`node_modules` will be made available at this path. So, run the following:
```sh
pnpm install # only the first time
pnpm build
nix build .#nodeDeps
```
This is because I still haven't figured out how to integrate the `rollup` build
pipeline with `nix`. Since the CSS and JS are minimal, I just do it manually for
now.
This will install the node modules in the Nix store and create the `result`
symlink. Keep in mind that if this `result` symlink is ever overwritten, you
need to re-run the above command, or else `node_modules` will not be
accessible.
Finally, run the following to generate the bundled CSS and JS files.
```sh
rollup -c
```
You have to re-run this whenever you change the CSS and JS files in `src/`.
Keep in mind that if `hakyll-site` ever overwrites `dist/out`, you will also
have to re-run this command.
<!--```sh-->
<!--nix build-->
<!---->
<!--nix run . watch-->
<!--```-->
<!---->
<!--This starts a hot reload server at `localhost:8000`.-->
<!---->
<!--```sh-->
<!--nix run . build-->
<!--```-->

flake.nix

@@ -1,5 +1,5 @@
{
description = "hakyll-nix-template";
description = "gradient ascent";
nixConfig = {
allow-import-from-derivation = "true";
@@ -16,19 +16,31 @@
inputs.nixpkgs.follows = "haskellNix/nixpkgs-unstable";
inputs.flake-utils.url = "github:numtide/flake-utils";
outputs = { self, nixpkgs, flake-utils, haskellNix }:
flake-utils.lib.eachDefaultSystem (system:
outputs =
{
self,
nixpkgs,
flake-utils,
haskellNix,
}:
flake-utils.lib.eachDefaultSystem (
system:
let
hls = pkgs.haskell-language-server;
overlays = [ haskellNix.overlay
overlays = [
haskellNix.overlay
(final: prev: {
hakyllProject = final.haskell-nix.project' {
src = ./ssg;
compiler-nix-name = "ghc948";
modules = [{ doHaddock = false; }];
modules = [ { doHaddock = false; } ];
shell.buildInputs = [
hakyll-site
hls
nodejs
pkgs.nodePackages.rollup
pkgs.nodePackages.npm
pkgs.node2nix
];
shell.tools = {
cabal = "latest";
@@ -44,7 +56,11 @@
inherit (haskellNix) config;
};
flake = pkgs.hakyllProject.flake {};
nodejs = pkgs.nodejs;
nodeDeps = (pkgs.callPackage ./nix { inherit pkgs nodejs system; }).nodeDependencies;
flake = pkgs.hakyllProject.flake { };
executable = "ssg:exe:hakyll-site";
@@ -52,7 +68,10 @@
website = pkgs.stdenv.mkDerivation {
name = "website";
buildInputs = [];
buildInputs = [
nodejs
pkgs.nodePackages.rollup
];
src = pkgs.nix-gitignore.gitignoreSourcePure [
./.gitignore
".git"
@@ -64,12 +83,19 @@
# https://github.com/NixOS/nix/issues/318#issuecomment-52986702
# https://github.com/MaxDaten/brutal-recipes/blob/source/default.nix#L24
LANG = "en_US.UTF-8";
LOCALE_ARCHIVE = pkgs.lib.optionalString
(pkgs.buildPlatform.libc == "glibc")
"${pkgs.glibcLocales}/lib/locale/locale-archive";
LOCALE_ARCHIVE = pkgs.lib.optionalString (
pkgs.buildPlatform.libc == "glibc"
) "${pkgs.glibcLocales}/lib/locale/locale-archive";
buildPhase = ''
# remove the node_modules symlink
rm -rf node_modules
${flake.packages.${executable}}/bin/hakyll-site build --verbose
ln -s ${nodeDeps}/lib/node_modules ./node_modules
export PATH="${nodeDeps}/bin:$PATH"
rollup -c
'';
installPhase = ''
@@ -78,7 +104,9 @@
'';
};
in flake // rec {
in
flake
// {
apps = {
default = flake-utils.lib.mkApp {
drv = hakyll-site;
@@ -87,9 +115,11 @@
};
packages = {
inherit hakyll-site website;
inherit hakyll-site website nodeDeps;
default = website;
};
formatter = pkgs.nixfmt-rfc-style;
}
);
}

nix/default.nix (new file, 17 lines)

@@ -0,0 +1,17 @@
# This file has been generated by node2nix 1.11.1. Do not edit!
{pkgs ? import <nixpkgs> {
inherit system;
}, system ? builtins.currentSystem, nodejs ? pkgs."nodejs_16"}:
let
nodeEnv = import ./node-env.nix {
inherit (pkgs) stdenv lib python2 runCommand writeTextFile writeShellScript;
inherit pkgs nodejs;
libtool = if pkgs.stdenv.isDarwin then pkgs.darwin.cctools else null;
};
in
import ./node-package.nix {
inherit (pkgs) fetchurl nix-gitignore stdenv lib fetchgit;
inherit nodeEnv;
}

nix/node-env.nix (new file, 689 lines)

@@ -0,0 +1,689 @@
# This file originates from node2nix
{lib, stdenv, nodejs, python2, pkgs, libtool, runCommand, writeTextFile, writeShellScript}:
let
# Workaround to cope with utillinux in Nixpkgs 20.09 and util-linux in Nixpkgs master
utillinux = if pkgs ? utillinux then pkgs.utillinux else pkgs.util-linux;
python = if nodejs ? python then nodejs.python else python2;
# Create a tar wrapper that filters all the 'Ignoring unknown extended header keyword' noise
tarWrapper = runCommand "tarWrapper" {} ''
mkdir -p $out/bin
cat > $out/bin/tar <<EOF
#! ${stdenv.shell} -e
$(type -p tar) "\$@" --warning=no-unknown-keyword --delay-directory-restore
EOF
chmod +x $out/bin/tar
'';
# Function that generates a TGZ file from a NPM project
buildNodeSourceDist =
{ name, version, src, ... }:
stdenv.mkDerivation {
name = "node-tarball-${name}-${version}";
inherit src;
buildInputs = [ nodejs ];
buildPhase = ''
export HOME=$TMPDIR
tgzFile=$(npm pack | tail -n 1) # Hooks to the pack command will add output (https://docs.npmjs.com/misc/scripts)
'';
installPhase = ''
mkdir -p $out/tarballs
mv $tgzFile $out/tarballs
mkdir -p $out/nix-support
echo "file source-dist $out/tarballs/$tgzFile" >> $out/nix-support/hydra-build-products
'';
};
# Common shell logic
installPackage = writeShellScript "install-package" ''
installPackage() {
local packageName=$1 src=$2
local strippedName
local DIR=$PWD
cd $TMPDIR
unpackFile $src
# Make the base dir in which the target dependency resides first
mkdir -p "$(dirname "$DIR/$packageName")"
if [ -f "$src" ]
then
# Figure out what directory has been unpacked
packageDir="$(find . -maxdepth 1 -type d | tail -1)"
# Restore write permissions to make building work
find "$packageDir" -type d -exec chmod u+x {} \;
chmod -R u+w "$packageDir"
# Move the extracted tarball into the output folder
mv "$packageDir" "$DIR/$packageName"
elif [ -d "$src" ]
then
# Get a stripped name (without hash) of the source directory.
# On old nixpkgs it's already set internally.
if [ -z "$strippedName" ]
then
strippedName="$(stripHash $src)"
fi
# Restore write permissions to make building work
chmod -R u+w "$strippedName"
# Move the extracted directory into the output folder
mv "$strippedName" "$DIR/$packageName"
fi
# Change to the package directory to install dependencies
cd "$DIR/$packageName"
}
'';
# Bundle the dependencies of the package
#
# Only include dependencies if they don't exist. They may also be bundled in the package.
includeDependencies = {dependencies}:
lib.optionalString (dependencies != []) (
''
mkdir -p node_modules
cd node_modules
''
+ (lib.concatMapStrings (dependency:
''
if [ ! -e "${dependency.packageName}" ]; then
${composePackage dependency}
fi
''
) dependencies)
+ ''
cd ..
''
);
# Recursively composes the dependencies of a package
composePackage = { name, packageName, src, dependencies ? [], ... }@args:
builtins.addErrorContext "while evaluating node package '${packageName}'" ''
installPackage "${packageName}" "${src}"
${includeDependencies { inherit dependencies; }}
cd ..
${lib.optionalString (builtins.substring 0 1 packageName == "@") "cd .."}
'';
pinpointDependencies = {dependencies, production}:
let
pinpointDependenciesFromPackageJSON = writeTextFile {
name = "pinpointDependencies.js";
text = ''
var fs = require('fs');
var path = require('path');
function resolveDependencyVersion(location, name) {
if(location == process.env['NIX_STORE']) {
return null;
} else {
var dependencyPackageJSON = path.join(location, "node_modules", name, "package.json");
if(fs.existsSync(dependencyPackageJSON)) {
var dependencyPackageObj = JSON.parse(fs.readFileSync(dependencyPackageJSON));
if(dependencyPackageObj.name == name) {
return dependencyPackageObj.version;
}
} else {
return resolveDependencyVersion(path.resolve(location, ".."), name);
}
}
}
function replaceDependencies(dependencies) {
if(typeof dependencies == "object" && dependencies !== null) {
for(var dependency in dependencies) {
var resolvedVersion = resolveDependencyVersion(process.cwd(), dependency);
if(resolvedVersion === null) {
process.stderr.write("WARNING: cannot pinpoint dependency: "+dependency+", context: "+process.cwd()+"\n");
} else {
dependencies[dependency] = resolvedVersion;
}
}
}
}
/* Read the package.json configuration */
var packageObj = JSON.parse(fs.readFileSync('./package.json'));
/* Pinpoint all dependencies */
replaceDependencies(packageObj.dependencies);
if(process.argv[2] == "development") {
replaceDependencies(packageObj.devDependencies);
}
else {
packageObj.devDependencies = {};
}
replaceDependencies(packageObj.optionalDependencies);
replaceDependencies(packageObj.peerDependencies);
/* Write the fixed package.json file */
fs.writeFileSync("package.json", JSON.stringify(packageObj, null, 2));
'';
};
in
''
node ${pinpointDependenciesFromPackageJSON} ${if production then "production" else "development"}
${lib.optionalString (dependencies != [])
''
if [ -d node_modules ]
then
cd node_modules
${lib.concatMapStrings (dependency: pinpointDependenciesOfPackage dependency) dependencies}
cd ..
fi
''}
'';
# Recursively traverses all dependencies of a package and pinpoints all
# dependencies in the package.json file to the versions that are actually
# being used.
pinpointDependenciesOfPackage = { packageName, dependencies ? [], production ? true, ... }@args:
''
if [ -d "${packageName}" ]
then
cd "${packageName}"
${pinpointDependencies { inherit dependencies production; }}
cd ..
${lib.optionalString (builtins.substring 0 1 packageName == "@") "cd .."}
fi
'';
# Extract the Node.js source code which is used to compile packages with
# native bindings
nodeSources = runCommand "node-sources" {} ''
tar --no-same-owner --no-same-permissions -xf ${nodejs.src}
mv node-* $out
'';
# Script that adds _integrity fields to all package.json files to prevent NPM from consulting the cache (that is empty)
addIntegrityFieldsScript = writeTextFile {
name = "addintegrityfields.js";
text = ''
var fs = require('fs');
var path = require('path');
function augmentDependencies(baseDir, dependencies) {
for(var dependencyName in dependencies) {
var dependency = dependencies[dependencyName];
// Open package.json and augment metadata fields
var packageJSONDir = path.join(baseDir, "node_modules", dependencyName);
var packageJSONPath = path.join(packageJSONDir, "package.json");
if(fs.existsSync(packageJSONPath)) { // Only augment packages that exist. Sometimes we may have production installs in which development dependencies can be ignored
console.log("Adding metadata fields to: "+packageJSONPath);
var packageObj = JSON.parse(fs.readFileSync(packageJSONPath));
if(dependency.integrity) {
packageObj["_integrity"] = dependency.integrity;
} else {
packageObj["_integrity"] = "sha1-000000000000000000000000000="; // When no _integrity string has been provided (e.g. by Git dependencies), add a dummy one. It does not seem to harm and it bypasses downloads.
}
if(dependency.resolved) {
packageObj["_resolved"] = dependency.resolved; // Adopt the resolved property if one has been provided
} else {
packageObj["_resolved"] = dependency.version; // Set the resolved version to the version identifier. This prevents NPM from cloning Git repositories.
}
if(dependency.from !== undefined) { // Adopt from property if one has been provided
packageObj["_from"] = dependency.from;
}
fs.writeFileSync(packageJSONPath, JSON.stringify(packageObj, null, 2));
}
// Augment transitive dependencies
if(dependency.dependencies !== undefined) {
augmentDependencies(packageJSONDir, dependency.dependencies);
}
}
}
if(fs.existsSync("./package-lock.json")) {
var packageLock = JSON.parse(fs.readFileSync("./package-lock.json"));
if(![1, 2].includes(packageLock.lockfileVersion)) {
process.stderr.write("Sorry, I only understand lock file versions 1 and 2!\n");
process.exit(1);
}
if(packageLock.dependencies !== undefined) {
augmentDependencies(".", packageLock.dependencies);
}
}
'';
};
# Reconstructs a package-lock file from the node_modules/ folder structure and package.json files with dummy sha1 hashes
reconstructPackageLock = writeTextFile {
name = "reconstructpackagelock.js";
text = ''
var fs = require('fs');
var path = require('path');
var packageObj = JSON.parse(fs.readFileSync("package.json"));
var lockObj = {
name: packageObj.name,
version: packageObj.version,
lockfileVersion: 2,
requires: true,
packages: {
"": {
name: packageObj.name,
version: packageObj.version,
license: packageObj.license,
bin: packageObj.bin,
dependencies: packageObj.dependencies,
engines: packageObj.engines,
optionalDependencies: packageObj.optionalDependencies
}
},
dependencies: {}
};
function augmentPackageJSON(filePath, packages, dependencies) {
var packageJSON = path.join(filePath, "package.json");
if(fs.existsSync(packageJSON)) {
var packageObj = JSON.parse(fs.readFileSync(packageJSON));
packages[filePath] = {
version: packageObj.version,
integrity: "sha1-000000000000000000000000000=",
dependencies: packageObj.dependencies,
engines: packageObj.engines,
optionalDependencies: packageObj.optionalDependencies
};
dependencies[packageObj.name] = {
version: packageObj.version,
integrity: "sha1-000000000000000000000000000=",
dependencies: {}
};
processDependencies(path.join(filePath, "node_modules"), packages, dependencies[packageObj.name].dependencies);
}
}
function processDependencies(dir, packages, dependencies) {
if(fs.existsSync(dir)) {
var files = fs.readdirSync(dir);
files.forEach(function(entry) {
var filePath = path.join(dir, entry);
var stats = fs.statSync(filePath);
if(stats.isDirectory()) {
if(entry.substr(0, 1) == "@") {
// When we encounter a namespace folder, augment all packages belonging to the scope
var pkgFiles = fs.readdirSync(filePath);
pkgFiles.forEach(function(entry) {
if(stats.isDirectory()) {
var pkgFilePath = path.join(filePath, entry);
augmentPackageJSON(pkgFilePath, packages, dependencies);
}
});
} else {
augmentPackageJSON(filePath, packages, dependencies);
}
}
});
}
}
processDependencies("node_modules", lockObj.packages, lockObj.dependencies);
fs.writeFileSync("package-lock.json", JSON.stringify(lockObj, null, 2));
'';
};
# Script that links bins defined in package.json to the node_modules bin directory
# NPM does not do this for top-level packages itself anymore as of v7
linkBinsScript = writeTextFile {
name = "linkbins.js";
text = ''
var fs = require('fs');
var path = require('path');
var packageObj = JSON.parse(fs.readFileSync("package.json"));
var nodeModules = Array(packageObj.name.split("/").length).fill("..").join(path.sep);
if(packageObj.bin !== undefined) {
fs.mkdirSync(path.join(nodeModules, ".bin"))
if(typeof packageObj.bin == "object") {
Object.keys(packageObj.bin).forEach(function(exe) {
if(fs.existsSync(packageObj.bin[exe])) {
console.log("linking bin '" + exe + "'");
fs.symlinkSync(
path.join("..", packageObj.name, packageObj.bin[exe]),
path.join(nodeModules, ".bin", exe)
);
}
else {
console.log("skipping non-existent bin '" + exe + "'");
}
})
}
else {
if(fs.existsSync(packageObj.bin)) {
console.log("linking bin '" + packageObj.bin + "'");
fs.symlinkSync(
path.join("..", packageObj.name, packageObj.bin),
path.join(nodeModules, ".bin", packageObj.name.split("/").pop())
);
}
else {
console.log("skipping non-existent bin '" + packageObj.bin + "'");
}
}
}
else if(packageObj.directories !== undefined && packageObj.directories.bin !== undefined) {
fs.mkdirSync(path.join(nodeModules, ".bin"))
fs.readdirSync(packageObj.directories.bin).forEach(function(exe) {
if(fs.existsSync(path.join(packageObj.directories.bin, exe))) {
console.log("linking bin '" + exe + "'");
fs.symlinkSync(
path.join("..", packageObj.name, packageObj.directories.bin, exe),
path.join(nodeModules, ".bin", exe)
);
}
else {
console.log("skipping non-existent bin '" + exe + "'");
}
})
}
'';
};
prepareAndInvokeNPM = {packageName, bypassCache, reconstructLock, npmFlags, production}:
let
forceOfflineFlag = if bypassCache then "--offline" else "--registry http://www.example.com";
in
''
# Pinpoint the versions of all dependencies to the ones that are actually being used
echo "pinpointing versions of dependencies..."
source $pinpointDependenciesScriptPath
# Patch the shebangs of the bundled modules to prevent them from
# calling executables outside the Nix store as much as possible
patchShebangs .
# Deploy the Node.js package by running npm install. Since the
# dependencies have been provided already by ourselves, it should not
# attempt to install them again, which is good, because we want to make
# it Nix's responsibility. If it needs to install any dependencies
# anyway (e.g. because the dependency parameters are
# incomplete/incorrect), it fails.
#
# The other responsibilities of NPM are kept -- version checks, build
# steps, postprocessing etc.
export HOME=$TMPDIR
cd "${packageName}"
runHook preRebuild
${lib.optionalString bypassCache ''
${lib.optionalString reconstructLock ''
if [ -f package-lock.json ]
then
echo "WARNING: Reconstruct lock option enabled, but a lock file already exists!"
echo "This will most likely result in version mismatches! We will remove the lock file and regenerate it!"
rm package-lock.json
else
echo "No package-lock.json file found, reconstructing..."
fi
node ${reconstructPackageLock}
''}
node ${addIntegrityFieldsScript}
''}
npm ${forceOfflineFlag} --nodedir=${nodeSources} ${npmFlags} ${lib.optionalString production "--production"} rebuild
runHook postRebuild
if [ "''${dontNpmInstall-}" != "1" ]
then
# NPM tries to download packages even when they already exist if npm-shrinkwrap is used.
rm -f npm-shrinkwrap.json
npm ${forceOfflineFlag} --nodedir=${nodeSources} --no-bin-links --ignore-scripts ${npmFlags} ${lib.optionalString production "--production"} install
fi
# Link executables defined in package.json
node ${linkBinsScript}
'';
# Builds and composes an NPM package including all its dependencies
buildNodePackage =
{ name
, packageName
, version ? null
, dependencies ? []
, buildInputs ? []
, production ? true
, npmFlags ? ""
, dontNpmInstall ? false
, bypassCache ? false
, reconstructLock ? false
, preRebuild ? ""
, dontStrip ? true
, unpackPhase ? "true"
, buildPhase ? "true"
, meta ? {}
, ... }@args:
let
extraArgs = removeAttrs args [ "name" "dependencies" "buildInputs" "dontStrip" "dontNpmInstall" "preRebuild" "unpackPhase" "buildPhase" "meta" ];
in
stdenv.mkDerivation ({
name = "${name}${if version == null then "" else "-${version}"}";
buildInputs = [ tarWrapper python nodejs ]
++ lib.optional (stdenv.isLinux) utillinux
++ lib.optional (stdenv.isDarwin) libtool
++ buildInputs;
inherit nodejs;
inherit dontStrip; # Stripping may fail a build for some package deployments
inherit dontNpmInstall preRebuild unpackPhase buildPhase;
compositionScript = composePackage args;
pinpointDependenciesScript = pinpointDependenciesOfPackage args;
passAsFile = [ "compositionScript" "pinpointDependenciesScript" ];
installPhase = ''
source ${installPackage}
# Create and enter a root node_modules/ folder
mkdir -p $out/lib/node_modules
cd $out/lib/node_modules
# Compose the package and all its dependencies
source $compositionScriptPath
${prepareAndInvokeNPM { inherit packageName bypassCache reconstructLock npmFlags production; }}
# Create symlink to the deployed executable folder, if applicable
if [ -d "$out/lib/node_modules/.bin" ]
then
ln -s $out/lib/node_modules/.bin $out/bin
# Fixup all executables
ls $out/bin/* | while read i
do
file="$(readlink -f "$i")"
chmod u+rwx "$file"
if isScript "$file"
then
sed -i 's/\r$//' "$file" # convert crlf to lf
fi
done
fi
# Create symlinks to the deployed manual page folders, if applicable
if [ -d "$out/lib/node_modules/${packageName}/man" ]
then
mkdir -p $out/share
for dir in "$out/lib/node_modules/${packageName}/man/"*
do
mkdir -p $out/share/man/$(basename "$dir")
for page in "$dir"/*
do
ln -s $page $out/share/man/$(basename "$dir")
done
done
fi
# Run post install hook, if provided
runHook postInstall
'';
meta = {
# default to Node.js' platforms
platforms = nodejs.meta.platforms;
} // meta;
} // extraArgs);
# Builds a node environment (a node_modules folder and a set of binaries)
buildNodeDependencies =
{ name
, packageName
, version ? null
, src
, dependencies ? []
, buildInputs ? []
, production ? true
, npmFlags ? ""
, dontNpmInstall ? false
, bypassCache ? false
, reconstructLock ? false
, dontStrip ? true
, unpackPhase ? "true"
, buildPhase ? "true"
, ... }@args:
let
extraArgs = removeAttrs args [ "name" "dependencies" "buildInputs" ];
in
stdenv.mkDerivation ({
name = "node-dependencies-${name}${if version == null then "" else "-${version}"}";
buildInputs = [ tarWrapper python nodejs ]
++ lib.optional (stdenv.isLinux) utillinux
++ lib.optional (stdenv.isDarwin) libtool
++ buildInputs;
inherit dontStrip; # Stripping may fail a build for some package deployments
inherit dontNpmInstall unpackPhase buildPhase;
includeScript = includeDependencies { inherit dependencies; };
pinpointDependenciesScript = pinpointDependenciesOfPackage args;
passAsFile = [ "includeScript" "pinpointDependenciesScript" ];
installPhase = ''
source ${installPackage}
mkdir -p $out/${packageName}
cd $out/${packageName}
source $includeScriptPath
# Create fake package.json to make the npm commands work properly
cp ${src}/package.json .
chmod 644 package.json
${lib.optionalString bypassCache ''
if [ -f ${src}/package-lock.json ]
then
cp ${src}/package-lock.json .
chmod 644 package-lock.json
fi
''}
# Go to the parent folder to make sure that all packages are pinpointed
cd ..
${lib.optionalString (builtins.substring 0 1 packageName == "@") "cd .."}
${prepareAndInvokeNPM { inherit packageName bypassCache reconstructLock npmFlags production; }}
# Expose the executables that were installed
cd ..
${lib.optionalString (builtins.substring 0 1 packageName == "@") "cd .."}
mv ${packageName} lib
ln -s $out/lib/node_modules/.bin $out/bin
'';
} // extraArgs);
# Builds a development shell
buildNodeShell =
{ name
, packageName
, version ? null
, src
, dependencies ? []
, buildInputs ? []
, production ? true
, npmFlags ? ""
, dontNpmInstall ? false
, bypassCache ? false
, reconstructLock ? false
, dontStrip ? true
, unpackPhase ? "true"
, buildPhase ? "true"
, ... }@args:
let
nodeDependencies = buildNodeDependencies args;
extraArgs = removeAttrs args [ "name" "dependencies" "buildInputs" "dontStrip" "dontNpmInstall" "unpackPhase" "buildPhase" ];
in
stdenv.mkDerivation ({
name = "node-shell-${name}${if version == null then "" else "-${version}"}";
buildInputs = [ python nodejs ] ++ lib.optional (stdenv.isLinux) utillinux ++ buildInputs;
buildCommand = ''
mkdir -p $out/bin
cat > $out/bin/shell <<EOF
#! ${stdenv.shell} -e
$shellHook
exec ${stdenv.shell}
EOF
chmod +x $out/bin/shell
'';
# Provide the dependencies in a development shell through the NODE_PATH environment variable
inherit nodeDependencies;
shellHook = lib.optionalString (dependencies != []) ''
export NODE_PATH=${nodeDependencies}/lib/node_modules
export PATH="${nodeDependencies}/bin:$PATH"
'';
} // extraArgs);
in
{
buildNodeSourceDist = lib.makeOverridable buildNodeSourceDist;
buildNodePackage = lib.makeOverridable buildNodePackage;
buildNodeDependencies = lib.makeOverridable buildNodeDependencies;
buildNodeShell = lib.makeOverridable buildNodeShell;
}

nix/node-package.nix (new file, 2516 lines)

File diff suppressed because it is too large.

node_modules (new symbolic link)

@@ -0,0 +1 @@
result/lib/node_modules/

package.json

@@ -16,9 +16,7 @@
"@tailwindcss/typography": "^0.5.13",
"autoprefixer": "^10.4.19",
"postcss": "^8.4.38",
"postcss-cli": "^11.0.0",
"postcss-minify": "^1.1.0",
"rollup": "^4.18.0",
"rollup-plugin-postcss": "^4.0.2",
"tailwindcss": "^3.4.3"
}

File diff suppressed because it is too large.

rollup.config.js

@@ -1,13 +1,13 @@
import autoprefixer from "autoprefixer";
import postcss from "rollup-plugin-postcss";
import tailwindcss from "tailwindcss";
import postcssMinify from "postcss-minify";
import terser from "@rollup/plugin-terser";
import autoprefixer from "autoprefixer"
import postcss from "rollup-plugin-postcss"
import tailwindcss from "tailwindcss"
import postcssMinify from "postcss-minify"
import terser from "@rollup/plugin-terser"
export default {
input: "src/js/main.js",
output: {
file: "src/out/bundle.js",
file: "dist/out/bundle.js",
},
plugins: [
postcss({
@@ -16,4 +16,4 @@ export default {
}),
terser(),
],
};
}


@@ -92,9 +92,9 @@
@apply text-center;
}
details {
@apply cursor-pointer;
@apply leading-loose sm:leading-[2] my-4 overflow-x-auto sm:text-lg font-light;
}
details summary {
@apply mb-1;
@apply mb-1 cursor-pointer;
}
}

(Four binary image files deleted: 1 MiB, 40 KiB, 86 KiB, 256 KiB.)

File diff suppressed because one or more lines are too long


@@ -1 +0,0 @@
const e=window.matchMedia("(prefers-color-scheme: dark)").matches,t=()=>{document.documentElement.classList.remove("dark")},s=()=>{document.documentElement.classList.add("dark")};let o="dark"===localStorage.getItem("theme")?2:"light"===localStorage.getItem("theme")?1:0;const a=document.getElementById("theme-toggle");a.addEventListener("click",(()=>{switch(o=(o+1)%3,o){case 0:localStorage.removeItem("theme"),e?document.documentElement.classList.add("dark"):document.documentElement.classList.remove("dark"),a.innerText="theme: system";break;case 1:e?(localStorage.setItem("theme","light"),t(),a.innerText="theme: light"):(localStorage.setItem("theme","dark"),s(),a.innerText="theme: dark");break;case 2:e?(localStorage.setItem("theme","dark"),s(),a.innerText="theme: dark"):(localStorage.setItem("theme","light"),t(),a.innerText="theme: light")}}));const n=()=>{document.body.classList.remove("font-sans"),document.body.classList.remove("font-serif")},m=e=>{e&&"serif"===e&&(n(),document.body.classList.add("font-serif")),e&&"sans"===e&&(n(),document.body.classList.add("font-sans")),e||n()};let c=localStorage.getItem("font");m();const l=document.getElementById("font-toggle");l.addEventListener("click",(()=>{c=localStorage.getItem("font"),"sans"===c?(c="serif",l.innerText="serif",localStorage.setItem("font","serif")):(c="sans",l.innerText="sans",localStorage.setItem("font","sans")),m(c)}));


@@ -0,0 +1,197 @@
---
author: "Youwen Wu"
authorTwitter: "@youwen"
desc: "and the future of operating systems"
image: "./images/gradient-ascent.jpg"
keywords: "nix, nixos, functional programming, linux, unix"
lang: "en"
title: "a retrospective on NixOS"
updated: "2024-05-25T12:00:00Z"
---
Many people more knowledgeable than me have already written at length about the
virtues of NixOS and _declarative configuration_ and _immutability_ and such. I
doubt what I have to say is particularly novel to those already familiar with
Nix, but I'd like to discuss precisely what brings people to NixOS in the first
place.
Many people will introduce NixOS by first introducing the Nix package manager,
and immediately jumping into terms like _derivation_ and _immutability_ and
_reproducibility_ and whatnot. And while these are important concepts for
understanding the system at large, it's not very convincing for people looking
to try out the system. After all, most people don't (or at least shouldn't!)
choose their tools based on hype or purported benefits, but based on the
problems they help them solve.
Instead of immediately evangelizing about the virtues of Nix and NixOS, I'll
first motivate the reasons for why I chose a tool with exactly its properties
(but not to worry, the evangelizing will come later).
Essentially: allow me to introduce you to the
origins of the [NixOS God
Complex](https://www.reddit.com/r/NixOS/comments/kauf1m/dealing_with_post_nixflake_god_complex/).
---
My goals for my system are as follows:
- Allow my computing environment to exist on different computers at the same
time (essentially, sync up configurations between machines)
- Precisely control the software and services on my machine. I should be able
to obtain binaries of most things to save time, but be able to step into the
source and apply patches or configuration as desired
- For the OS to be absolutely unbreakable
- Never configure the system twice; once I solve a problem, I should have a
reproducible solution that solves it permanently
- Be able to backup my system configuration and quickly redeploy it whenever
needed
- Avoid janky solutions to these problems that introduce tech debt. I don't
want to have to rely on disk images or backups; I want to be able to create
fresh installations quickly
Essentially, I want to synchronize the configuration of my entire system across
multiple machines while maintaining a stable and usable system I'm not worried
will inadvertently fall apart with a routine system update. When I tweak and
mess with some settings on my desktop, I should be able to push to a `git`
repository and pull it down on my laptop and have the tweaks carried over. This
even includes system-level configuration like the applications installed,
system daemons, and other core system services.
I want a source-based and binary-based distribution simultaneously. And I want a
self-documenting reproducible system where every tiny tweak is
deterministically applied. And I want to be able to install my configurations
onto a new computer, from scratch, in an installer, effectively creating my own
custom Linux distribution.
Oh, and I also want to solve the "works on my machine" problem, and never have
trouble using software someone else packaged and claims works on their end, but
fails on my computer.
All or even just a few of these goals seem unattainable to the typical Linux
user (not to mention those still on Windows and macOS $\dots$ _oh, the
horror!_). But I was in fact able to achieve all of them.
---
To begin, let's examine how one might try to approach these problems with the
common solutions.
Let's talk about sharing configuration among multiple computers first, which
can be thought of as some form of "settings sync".
Most people have encountered solutions to synchronizing configuration in two
ways: either the entire service runs in the cloud, so it's really the _same_
environment accessed from multiple places (e.g. Google Docs), or it's some
half-baked, opaque solution involving making an account and sending all your
settings to a sync server (e.g. Mozilla Firefox).
The more technically minded may instead opt to create a "dotfiles" repository,
holding their vast collection of meticulously crafted configuration files.
These repos often come with a janky `install.sh` script that does its best to
install all the files into the correct place. This usually works the first
time, but trying to keep the installed dotfiles in sync with a central
repository is a whole other problem.
There are also dotfile managers like `chezmoi` or GNU Stow. I have not tried
these, so I make no judgements on their utility for their intended purpose.
These dotfile management solutions may work well for managing configuration
files, but they both have the same issue: you also need to install the software
you're configuring!
The software and the configuration are fundamentally tied together; these are
not concerns to be separated. If the software is installed, it almost always
needs to be configured anyways. If the configuration exists, the software
should be installed. So a sane solution needs to both put the configuration in
the right place, _and_ set up the system's programs along with all their
dependencies!
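
Jumping ahead slightly, here is a minimal sketch of what "software and
configuration together" looks like in `home-manager`, a tool discussed below.
The option names are real `home-manager` options, but the values are
placeholders: enabling a program installs it _and_ manages its dotfiles, so the
two can never drift apart.

```nix
# Hypothetical home-manager fragment: one declaration both installs
# the software and writes its configuration.
{
  programs.git = {
    enable = true;                  # installs git
    userName = "example-user";      # placeholder value
    userEmail = "user@example.com"; # placeholder value
  };
  # fzf is installed together with its shell integration options.
  programs.fzf.enable = true;
}
```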
So, the most obsessive *nix hackers reach for tools like
[Ansible](https://www.ansible.com/), which promise automatic configuration of
entire systems. Though Ansible was initially designed to deploy cloud servers
quickly through the Infrastructure-as-Code approach, some people opt to use it
for deploying and managing their personal systems as well. I have not
personally tried it beyond playing with a few examples. The consensus seems to
be that it works fine for simple use cases but gets quite unwieldy for more
complex purposes (especially for personal systems, which aren't expected to be
as ephemeral as servers).
A system like Ansible combined with a system to manage configuration files
might be able to achieve a few of our goals. We can keep configuration in sync
between computers and we can quickly redeploy our system. But anyone who has
tried this will tell you that it's incredibly uncomfortable to use; our
existing operating systems are simply not designed to be managed in this
manner. Inevitably you will experience desynchronization between the
configuration and the actual state of the machine.
Also, this does not solve some of our other problems. We'll still need tools
like Docker to reproducibly build software and figure out a way to keep our
system stable.
If you agree with the premises I've laid out up to this point, that none of
these solutions provide a satisfying solution to our computing woes, you might
come to the conclusion that I've made. We need a solution that does _all of
it_. A unified tool for reliably deploying software and managing your system
configuration. And it must necessarily be declarative and reproducible, because
that is the only sane way to manage a system. Imagine working on a programming
project where recompiling with the same source code would non-deterministically
produce different results based on the environment! We should be able to write
files that declaratively and precisely specify the state of the whole system, and
then be able to revert these files or tweak them with deterministic results
that don't leave behind any broken programs or files.
Well, [Nix](https://nixos.org/) is the _purely functional_ package manager
(i.e. declarative, reproducible), and NixOS is a Linux distribution that is
managed entirely by Nix. Essentially, Nix provides a solution to the problem of
_software deployment_, and in fact was purpose-built to do so in Eelco
Dolstra's seminal [PhD
thesis](https://edolstra.github.io/pubs/phd-thesis.pdf). It effectively solves
the problem of "works on my machine" by _forcing_ the user to actually specify
all required dependencies. This makes it a little harder to write the initial
build configurations due to the strictness imposed. But the reward is that if a
piece of software builds on one machine, it's guaranteed to build on another.
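
As a toy illustration (not this site's actual build), here is what a small
derivation might look like. The file names and commands are hypothetical, but
the shape is the point: every tool the build uses has to be listed explicitly.

```nix
# Minimal derivation sketch with fully explicit dependencies.
{ pkgs ? import <nixpkgs> { } }:
pkgs.stdenv.mkDerivation {
  name = "hello-page";
  src = ./.; # hypothetical source directory containing hello.md
  # pandoc is the only extra tool visible inside the build sandbox
  buildInputs = [ pkgs.pandoc ];
  buildPhase = "pandoc hello.md -o hello.html";
  installPhase = ''
    mkdir -p $out
    cp hello.html $out/
  '';
}
```

If `pandoc` were left out of `buildInputs`, the build would fail the same way
on every machine, rather than silently succeeding wherever `pandoc` happens to
be installed.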
NixOS is a system that takes the power of Nix and applies it to declaratively
configure an _entire Linux system_. All of the installed software and activated
services can be specified precisely using the Nix expression language, a purely
functional DSL used by Nix. Alongside installing the software, it also
configures it, effectively acting as a dotfile manager. Indeed, many core NixOS
services and a wide range of programs can be set up through _NixOS modules_,
where the program is installed and configured in the same place (and many
programs like `fzf`, `btop`, etc. have corresponding `home-manager` modules).
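
As a sketch (real option names, illustrative values), a NixOS module lets one
declaration install a package, generate its config file, and activate its
systemd unit all at once:

```nix
# Hypothetical NixOS configuration fragment: sshd is installed,
# configured, and enabled as a service from a single declaration.
{
  services.openssh = {
    enable = true;
    settings.PasswordAuthentication = false;
  };
  networking.firewall.allowedTCPPorts = [ 22 ];
}
```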
NixOS is also _immutable_, which means that the system cannot be modified after
it is built from the Nix files that declare it. How do you make changes to the
system then? Obviously, we just create a new system where the changed programs
and files are included, and the old ones are removed. But they are not deleted
from the hard drive; they still exist in the _Nix store_. So, the system can
provide precise atomic rollbacks between each "generation" of itself. Broke
your GRUB configuration so your system won't boot? Messed up your kernel
settings? Just select an older working generation from the boot menu and you
instantly have a working system again. You never have to worry about breaking
things during either routine or massive system updates.
And because the system is fully declarative, and modifying the system is done
only through modifying its Nix configuration files, you can version and sync
them up with Git. This solves the problem of keeping system environments in
sync; now, you truly only have to keep one repository of all your configuration
in sync, and all the software installation and deployment is handled for you by
a system designed precisely for that purpose.
This makes it possible for me to share common configuration between a multitude
of entirely distinct machines, including an `x86_64` desktop, an `x86_64`
laptop, an Apple Silicon MacBook running `aarch64` NixOS via
[Asahi Linux](https://asahilinux.org/), and the same MacBook running macOS with
`nix-darwin`, sharing `home-manager` configuration with NixOS. The
configuration needed to adjust hardware-specific details for each machine is
isolated to the [hosts](./hosts) directory.
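
The rough shape of such a flake is sketched below; the host names and file
paths here are illustrative, not this repository's actual layout.

```nix
# Sketch of a multi-host flake: each machine imports the shared
# modules plus its own hardware-specific configuration.
{
  inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";

  outputs = { nixpkgs, ... }: {
    nixosConfigurations = {
      desktop = nixpkgs.lib.nixosSystem {
        system = "x86_64-linux";
        modules = [ ./common.nix ./hosts/desktop ];
      };
      laptop = nixpkgs.lib.nixosSystem {
        system = "x86_64-linux";
        modules = [ ./common.nix ./hosts/laptop ];
      };
    };
  };
}
```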
This works exceptionally well, evidenced by the fact that I have (almost) the
exact same environment across three separate machines, spanning two entirely
distinct CPU architectures.
In essence, the primary failure of deployment scripts, Ansible, and the like
is that they are _imperative_: they must specify precisely _how_ to set up the
system, down to minute details, whereas in a _declarative_ approach, the user
simply specifies what the system _should look like_, and abstractions take care
of the _how_. This is what NixOS does, and it gives you remote syncing,
versioning (via `git`), and rollbacks _for free_.


@@ -115,8 +115,8 @@
<h1 class="text-4xl md:text-5xl font-serif font-medium">
<a
href="/"
class="dark:hover:text-muted-dark hover:text-muted-light transition-all duration-500 text-nowrap tracking-wide hover:tracking-wider"
>Gradient Ascent</a
class="dark:hover:text-muted-dark hover:text-muted-light transition-all duration-500 text-nowrap tracking-wide"
><em>Gradient Ascent.</em></a
>
</h1>
<div
@@ -124,7 +124,7 @@
></div>
</div>
<p class="mt-8 mb-3 px-1 italic font-light">
a web-log about computers, math, hacks, games, and life.
a web-log about computers, math, hacks, and all the rest.
</p>
<a class="text-sm external-link" href="https://youwen.dev"
><em>by </em>Youwen Wu</a