day-one-to-hugo
A tool for converting posts from a Day One JSON export to posts for a Hugo site.
Installation
Linux
For Linux, both RPM and DEB packages are available, plus regular .tar.gz files.
After installation, the command will be available at /usr/local/bin/day-one-to-hugo.
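As an illustration, a downloaded package can be installed with dpkg or rpm. The file names below are hypothetical and will vary by release:

$ sudo dpkg -i day-one-to-hugo_1.0.0_linux_amd64.deb
$ sudo rpm -i day-one-to-hugo_1.0.0_linux_amd64.rpm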
MacOS
The recommended way to install on MacOS is with Homebrew. Run the following commands:
brew tap lmika/day-one-to-hugo
brew install day-one-to-hugo
There's also a regular .tar.gz file available.
Windows
Zip files containing the Windows binaries are available.
Go
If you have Go, you can install day-one-to-hugo using this command:
go install github.com/lmika/day-one-to-hugo@latest
Basic Usage
- Make a JSON export from Day One.
- Unzip it and locate the .json files of the journals you'd like to export.
- Create a new Hugo site.
- Run day-one-to-hugo, passing the .json file of the journal you want to export, and setting -d to the directory of the Hugo site:
$ day-one-to-hugo -d hugo-site Journal.json
This will write out all posts to <hugo-site>/posts and all media (images and videos) to <hugo-site>/static.
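As a rough sketch of the resulting layout, assuming the defaults described above (the post and image file names here are illustrative only; actual names depend on the journal contents):

hugo-site/
  posts/
    2023-04-01-a-day-at-the-beach.md    (exported post; name illustrative)
  static/
    photo-1234.jpeg                     (exported image; name illustrative)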
Full Usage
$ day-one-to-hugo OPTIONS JSON ...
Options are:
--site, -d <dir> — base directory of the Hugo site to export to. Default is out.
--posts <dir> — directory within the Hugo content directory to write the posts to (this is usually a property of the layout). Default is posts.
--from, -f <date> — export posts that occur on or after this date. Format is YYYY-MM-DD.
--to, -t <date> — export posts that occur before, but not including, this date. Format is YYYY-MM-DD.
--dry-run, -n — only print the date and first line of the posts that would be exported. Does not export anything.
--keep-exif — keep EXIF metadata on images exported to Hugo. By default, images are exported to Hugo with their EXIF tags stripped. Does not affect videos (their EXIF tags are never modified).
JSON is one or more Journal .json files to export.
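For example, the following command (the journal file name and date range are illustrative) would preview which posts from 2023 would be exported, without writing anything:

$ day-one-to-hugo -d hugo-site --from 2023-01-01 --to 2024-01-01 --dry-run Journal.json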