Merge public-gh/master into paperclip-company-import-export
@@ -1,7 +1,7 @@
 ---
 name: release-changelog
 description: >
-  Generate the stable Paperclip release changelog at releases/v{version}.md by
+  Generate the stable Paperclip release changelog at releases/vYYYY.MDD.P.md by
   reading commits, changesets, and merged PR context since the last stable tag.
 ---

@@ -9,20 +9,33 @@ description: >

 Generate the user-facing changelog for the **stable** Paperclip release.

+## Versioning Model
+
+Paperclip uses **calendar versioning (calver)**:
+
+- Stable releases: `YYYY.MDD.P` (e.g. `2026.318.0`)
+- Canary releases: `YYYY.MDD.P-canary.N` (e.g. `2026.318.1-canary.0`)
+- Git tags: `vYYYY.MDD.P` for stable, `canary/vYYYY.MDD.P-canary.N` for canary
+
+There are no major/minor/patch bumps. The stable version is derived from the
+intended release date (UTC) plus the next same-day stable patch slot.
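
The date-to-version mapping above can be sketched in shell. This is an illustrative helper only, not the actual logic in `scripts/release.sh`; the function name is made up:

```shell
# Derive the calver "YYYY.MDD" prefix for a given UTC date.
# M is the month without zero padding; DD is the zero-padded day.
calver_prefix() {
  local utc_date="$1"            # e.g. 2026-03-18
  local year month day
  year="${utc_date%%-*}"
  month="${utc_date#*-}"; month="${month%%-*}"
  day="${utc_date##*-}"
  # Strip the leading zero from the month only (10# avoids octal parsing).
  month="$((10#$month))"
  printf '%s.%s%s' "$year" "$month" "$day"
}

calver_prefix 2026-03-18   # → 2026.318
```

The patch slot `P` is then appended on top of this prefix.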

 Output:

-- `releases/v{version}.md`
+- `releases/vYYYY.MDD.P.md`

-Important rule:
+Important rules:

-- even if there are canary releases such as `1.2.3-canary.0`, the changelog file stays `releases/v1.2.3.md`
+- even if there are canary releases such as `2026.318.1-canary.0`, the changelog file stays `releases/v2026.318.1.md`
+- do not derive versions from semver bump types
+- do not create canary changelog files

 ## Step 0 — Idempotency Check

 Before generating anything, check whether the file already exists:

 ```bash
-ls releases/v{version}.md 2>/dev/null
+ls releases/vYYYY.MDD.P.md 2>/dev/null
 ```

 If it exists:

@@ -41,13 +54,14 @@ git tag --list 'v*' --sort=-version:refname | head -1
 git log v{last}..HEAD --oneline --no-merges
 ```

-The planned stable version comes from one of:
+The stable version comes from one of:

 - an explicit maintainer request
-- the chosen bump type applied to the last stable tag
+- `./scripts/release.sh stable --date YYYY-MM-DD --print-version`
 - the release plan already agreed in `doc/RELEASING.md`

 Do not derive the changelog version from a canary tag or prerelease suffix.
+Do not derive major/minor/patch bumps from API intent — calver uses the date and same-day stable slot.

 ## Step 2 — Gather the Raw Inputs

@@ -73,7 +87,6 @@ Look for:

 - destructive migrations
 - removed or changed API fields/endpoints
 - renamed or removed config keys
-- `major` changesets
 - `BREAKING:` or `BREAKING CHANGE:` commit signals

 Key commands:

@@ -85,7 +98,8 @@ git diff v{last}..HEAD -- server/src/routes/ server/src/api/
 git log v{last}..HEAD --format="%s" | rg -n 'BREAKING CHANGE|BREAKING:|^[a-z]+!:' || true
 ```
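
To sanity-check the breaking-change patterns, here is a self-contained example with hypothetical commit subjects; `grep -E` stands in for `rg`:

```shell
# Hypothetical commit subjects; the pattern mirrors the rg call above.
subjects='feat!: drop legacy import API
fix: retry webhook delivery
chore: bump deps
refactor: add BREAKING CHANGE: note for renamed config keys'

# Prints the first and last subject: feat!: matches ^[a-z]+!: and the
# refactor subject contains "BREAKING CHANGE".
printf '%s\n' "$subjects" | grep -E 'BREAKING CHANGE|BREAKING:|^[a-z]+!:'
```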

-If the requested bump is lower than the minimum required bump, flag that before the release proceeds.
+If breaking changes are detected, flag them prominently — they must appear in the
+Breaking Changes section with an upgrade path.

 ## Step 4 — Categorize for Users

@@ -130,9 +144,9 @@ Rules:

 Template:

 ```markdown
-# v{version}
+# vYYYY.MDD.P

-> Released: {YYYY-MM-DD}
+> Released: YYYY-MM-DD

 ## Breaking Changes

@@ -2,23 +2,21 @@
 name: release
 description: >
   Coordinate a full Paperclip release across engineering verification, npm,
-  GitHub, website publishing, and announcement follow-up. Use when leadership
-  asks to ship a release, not merely to discuss version bumps.
+  GitHub, smoke testing, and announcement follow-up. Use when leadership asks
+  to ship a release, not merely to discuss versioning.
 ---

 # Release Coordination Skill

-Run the full Paperclip release as a maintainer workflow, not just an npm publish.
+Run the full Paperclip maintainer release workflow, not just an npm publish.

 This skill coordinates:

 - stable changelog drafting via `release-changelog`
-- release-train setup via `scripts/release-start.sh`
-- prerelease canary publishing via `scripts/release.sh --canary`
+- canary verification and publish status from `master`
 - Docker smoke testing via `scripts/docker-onboard-smoke.sh`
-- stable publishing via `scripts/release.sh`
-- pushing the stable branch commit and tag
-- GitHub Release creation via `scripts/create-github-release.sh`
+- manual stable promotion from a chosen source ref
+- GitHub Release creation
 - website / announcement follow-up tasks

 ## Trigger

@@ -26,8 +24,9 @@ This skill coordinates:

 Use this skill when leadership asks for:

 - "do a release"
-- "ship the next patch/minor/major"
-- "release vX.Y.Z"
+- "ship the release"
+- "promote this canary to stable"
+- "cut the stable release"

 ## Preconditions

@@ -35,10 +34,10 @@ Before proceeding, verify all of the following:

 1. `.agents/skills/release-changelog/SKILL.md` exists and is usable.
 2. The repo working tree is clean, including untracked files.
-3. There are commits since the last stable tag.
-4. The release SHA has passed the verification gate or is about to.
-5. If package manifests changed, the CI-owned `pnpm-lock.yaml` refresh is already merged on `master` before the release branch is cut.
-6. npm publish rights are available locally, or the GitHub release workflow is being used with trusted publishing.
+3. There is at least one canary or candidate commit since the last stable tag.
+4. The candidate SHA has passed the verification gate or is about to.
+5. If manifests changed, the CI-owned `pnpm-lock.yaml` refresh is already merged on `master`.
+6. npm publish rights are available through GitHub trusted publishing, or through local npm auth for emergency/manual use.
+7. If running through Paperclip, you have issue context for status updates and follow-up task creation.

 If any precondition fails, stop and report the blocker.

@@ -47,78 +46,67 @@ If any precondition fails, stop and report the blocker.

 Collect these inputs up front:

-- requested bump: `patch`, `minor`, or `major`
-- whether this run is a dry run or live release
-- whether the release is being run locally or from GitHub Actions
+- whether the target is a canary check or a stable promotion
+- the candidate `source_ref` for stable
+- whether the stable run is dry-run or live
 - release issue / company context for website and announcement follow-up

 ## Step 0 — Release Model

-Paperclip now uses this release model:
+Paperclip now uses a commit-driven release model:

-1. Start or resume `release/X.Y.Z`
-2. Draft the **stable** changelog as `releases/vX.Y.Z.md`
-3. Publish one or more **prerelease canaries** such as `X.Y.Z-canary.0`
-4. Smoke test the canary via Docker
-5. Publish the stable version `X.Y.Z`
-6. Push the stable branch commit and tag
-7. Create the GitHub Release
-8. Merge `release/X.Y.Z` back to `master` without squash or rebase
-9. Complete website and announcement surfaces
+1. every push to `master` publishes a canary automatically
+2. canaries use `YYYY.MDD.P-canary.N`
+3. stable releases use `YYYY.MDD.P`
+4. the middle slot is `MDD`, where `M` is the UTC month and `DD` is the zero-padded UTC day
+5. the stable patch slot increments when more than one stable ships on the same UTC date
+6. stable releases are manually promoted from a chosen tested commit or canary source commit
+7. only stable releases get `releases/vYYYY.MDD.P.md`, git tag `vYYYY.MDD.P`, and a GitHub Release
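
The same-day patch slot rule (items 4 and 5 above) can be illustrated with a small helper; the tag values are hypothetical and the real resolution belongs to `./scripts/release.sh stable --print-version`:

```shell
# Given the day prefix (e.g. 2026.318) and the existing stable tags,
# pick the next free patch slot for that UTC date.
next_patch_slot() {
  local prefix="$1"; shift
  local max=-1 tag p
  for tag in "$@"; do
    case "$tag" in
      "v${prefix}."*)
        p="${tag##*.}"
        [ "$p" -gt "$max" ] && max="$p"
        ;;
    esac
  done
  echo "$((max + 1))"
}

next_patch_slot 2026.318 v2026.317.0 v2026.318.0 v2026.318.1   # → 2
```

A date with no stable yet resolves to slot `0`.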

-Critical consequence:
+Critical consequences:

-- Canaries do **not** use promote-by-dist-tag anymore.
-- The changelog remains stable-only. Do not create `releases/vX.Y.Z-canary.N.md`.
+- do not use release branches as the default path
+- do not derive major/minor/patch bumps
+- do not create canary changelog files
+- do not create canary GitHub Releases

-## Step 1 — Decide the Stable Version
+## Step 1 — Choose the Candidate

-Start the release train first:
+For canary validation:
+
+- inspect the latest successful canary run on `master`
+- record the canary version and source SHA
+
+For stable promotion:
+
+1. choose the tested source ref
+2. confirm it is the exact SHA you want to promote
+3. resolve the target stable version with `./scripts/release.sh stable --date YYYY-MM-DD --print-version`

 Useful commands:

 ```bash
-./scripts/release-start.sh {patch|minor|major}
 git tag --list 'v*' --sort=-version:refname | head -1
 git log --oneline --no-merges
 npm view paperclipai@canary version
 ```

-Then run release preflight:
-
-```bash
-./scripts/release-preflight.sh canary {patch|minor|major}
-# or
-./scripts/release-preflight.sh stable {patch|minor|major}
-```
-
 Then use the last stable tag as the base:

 ```bash
 LAST_TAG=$(git tag --list 'v*' --sort=-version:refname | head -1)
 git log "${LAST_TAG}..HEAD" --oneline --no-merges
 git diff --name-only "${LAST_TAG}..HEAD" -- packages/db/src/migrations/
 git diff "${LAST_TAG}..HEAD" -- packages/db/src/schema/
 git log "${LAST_TAG}..HEAD" --format="%s" | rg -n 'BREAKING CHANGE|BREAKING:|^[a-z]+!:' || true
 ```

-Bump policy:
-
-- destructive migrations, removed APIs, breaking config changes -> `major`
-- additive migrations or clearly user-visible features -> at least `minor`
-- fixes only -> `patch`
-
-If the requested bump is too low, escalate it and explain why.

 ## Step 2 — Draft the Stable Changelog

-Invoke `release-changelog` and generate:
+Stable changelog files live at:

-- `releases/vX.Y.Z.md`
+- `releases/vYYYY.MDD.P.md`
+
+Invoke `release-changelog` and generate or update the stable notes only.

 Rules:

 - review the draft with a human before publish
 - preserve manual edits if the file already exists
-- keep the heading and filename stable-only, for example `v1.2.3`
-- do not create a separate canary changelog file
+- keep the filename stable-only
+- do not create a canary changelog file

-## Step 3 — Verify the Release SHA
+## Step 3 — Verify the Candidate SHA

 Run the standard gate:

@@ -128,41 +116,27 @@ pnpm test:run
 pnpm build
 ```

-If the release will be run through GitHub Actions, the workflow can rerun this gate. Still report whether the local tree currently passes.
+If the GitHub release workflow will run the publish, it can rerun this gate. Still report local status if you checked it.

-The GitHub Actions release workflow installs with `pnpm install --frozen-lockfile`. Treat that as a release invariant, not a nuisance: if manifests changed and the lockfile refresh PR has not landed yet, stop and wait for `master` to contain the committed lockfile before shipping.
+For PRs that touch release logic, the repo also runs a canary release dry-run in CI. That is a release-specific guard, not a substitute for the standard gate.

-## Step 4 — Publish a Canary
+## Step 4 — Validate the Canary

-Run from the `release/X.Y.Z` branch:
+The normal canary path is automatic from `master` via:

-```bash
-./scripts/release.sh {patch|minor|major} --canary --dry-run
-./scripts/release.sh {patch|minor|major} --canary
-```
+- `.github/workflows/release.yml`

-What this means:
+Confirm:

-- npm receives `X.Y.Z-canary.N` under dist-tag `canary`
-- `latest` remains unchanged
-- no git tag is created
-- the script cleans the working tree afterward
+1. verification passed
+2. npm canary publish succeeded
+3. git tag `canary/vYYYY.MDD.P-canary.N` exists

-Guard:
-
-- if the current stable is `0.2.7`, the next patch canary is `0.2.8-canary.0`
-- the tooling must never publish `0.2.7-canary.N` after `0.2.7` is already stable
-
-After publish, verify:
+Useful checks:

 ```bash
 npm view paperclipai@canary version
-```
-
-The user install path is:
-
-```bash
-npx paperclipai@canary onboard
+git tag --list 'canary/v*' --sort=-version:refname | head -5
 ```
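
To read the canary ordinal `N` back out of such a tag, a plain parameter expansion is enough (the tag value here is a made-up example):

```shell
# Extract the canary ordinal from a canary tag name, e.g.
# canary/v2026.318.1-canary.4 → 4
tag="canary/v2026.318.1-canary.4"
ordinal="${tag##*-canary.}"
echo "$ordinal"   # → 4
```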

 ## Step 5 — Smoke Test the Canary

@@ -173,60 +147,70 @@ Run:
 PAPERCLIPAI_VERSION=canary ./scripts/docker-onboard-smoke.sh
 ```

 Useful isolated variant:

 ```bash
 HOST_PORT=3232 DATA_DIR=./data/release-smoke-canary PAPERCLIPAI_VERSION=canary ./scripts/docker-onboard-smoke.sh
 ```

 Confirm:

 1. install succeeds
-2. onboarding completes
-3. server boots
-4. UI loads
-5. basic company/dashboard flow works
+2. onboarding completes without crashes
+3. the server boots
+4. the UI loads
+5. basic company creation and dashboard load work

 If smoke testing fails:

 - stop the stable release
-- fix the issue
-- publish another canary
-- repeat the smoke test
+- fix the issue on `master`
+- wait for the next automatic canary
+- rerun smoke testing

-Each retry should create a higher canary ordinal, while the stable target version can stay the same.

-## Step 6 — Publish Stable
+## Step 6 — Preview or Publish Stable

-Once the SHA is vetted, run:
+The normal stable path is manual `workflow_dispatch` on:
+
+- `.github/workflows/release.yml`
+
+Inputs:
+
+- `source_ref`
+- `stable_date`
+- `dry_run`
+
+Before live stable:
+
+1. resolve the target stable version with `./scripts/release.sh stable --date YYYY-MM-DD --print-version`
+2. ensure `releases/vYYYY.MDD.P.md` exists on the source ref
+3. run the stable workflow in dry-run mode first when practical
+4. then run the real stable publish
+
+The stable workflow:
+
+- re-verifies the exact source ref
+- computes the next stable patch slot for the chosen UTC date
+- publishes `YYYY.MDD.P` under dist-tag `latest`
+- creates git tag `vYYYY.MDD.P`
+- creates or updates the GitHub Release from `releases/vYYYY.MDD.P.md`
+
+Local emergency/manual commands:

 ```bash
-./scripts/release.sh {patch|minor|major} --dry-run
-./scripts/release.sh {patch|minor|major}
+./scripts/release.sh stable --dry-run
+./scripts/release.sh stable
+git push public-gh refs/tags/vYYYY.MDD.P
+./scripts/create-github-release.sh YYYY.MDD.P
 ```

-Stable publish does this:
-
-- publishes `X.Y.Z` to npm under `latest`
-- creates the local release commit
-- creates the local git tag `vX.Y.Z`
-
-Stable publish does **not** push the release for you.
-
-## Step 7 — Push and Create GitHub Release
-
-After stable publish succeeds:
-
-```bash
-git push public-gh HEAD --follow-tags
-./scripts/create-github-release.sh X.Y.Z
-```
-
-Use the stable changelog file as the GitHub Release notes source.
-
-Then open the PR from `release/X.Y.Z` back to `master` and merge without squash or rebase.

-## Step 8 — Finish the Other Surfaces
+## Step 7 — Finish the Other Surfaces

 Create or verify follow-up work for:

 - website changelog publishing
 - launch post / social announcement
-- any release summary in Paperclip issue context
+- release summary in Paperclip issue context

 These should reference the stable release, not the canary.

@@ -236,9 +220,9 @@ If the canary is bad:

 - publish another canary, do not ship stable

-If stable npm publish succeeds but push or GitHub release creation fails:
+If stable npm publish succeeds but tag push or GitHub release creation fails:

-- fix the git/GitHub issue immediately from the same checkout
+- fix the git/GitHub issue immediately from the same release result
 - do not republish the same version

 If `latest` is bad after stable publish:

@@ -247,15 +231,17 @@ If `latest` is bad after stable publish:
 ./scripts/rollback-latest.sh <last-good-version>
 ```

-Then fix forward with a new patch release.
+Then fix forward with a new stable release.

 ## Output

 When the skill completes, provide:

-- stable version and, if relevant, the final canary version tested
+- candidate SHA and tested canary version, if relevant
+- stable version, if promoted
 - verification status
 - npm status
 - smoke-test status
 - git tag / GitHub Release status
 - website / announcement follow-up status
 - rollback recommendation if anything is still partially complete

@@ -1,8 +0,0 @@
-# Changesets
-
-Hello and welcome! This folder has been automatically generated by `@changesets/cli`, a build tool that works
-with multi-package repos, or single-package repos to help you version and publish your code. You can
-find the full documentation for it [in our repository](https://github.com/changesets/changesets).
-
-We have a quick list of common questions to get you started engaging with this project in
-[our documentation](https://github.com/changesets/changesets/blob/main/docs/common-questions.md).

@@ -1,11 +0,0 @@
-{
-  "$schema": "https://unpkg.com/@changesets/config@3.1.3/schema.json",
-  "changelog": "@changesets/cli/changelog",
-  "commit": false,
-  "fixed": [["@paperclipai/*", "paperclipai"]],
-  "linked": [],
-  "access": "public",
-  "baseBranch": "master",
-  "updateInternalDependencies": "patch",
-  "ignore": ["@paperclipai/ui"]
-}
.github/CODEOWNERS (vendored, new file, 10 lines)
@@ -0,0 +1,10 @@
+# Replace @cryppadotta if a different maintainer or team should own release infrastructure.
+
+.github/** @cryppadotta @devinfoley
+scripts/release*.sh @cryppadotta @devinfoley
+scripts/release-*.mjs @cryppadotta @devinfoley
+scripts/create-github-release.sh @cryppadotta @devinfoley
+scripts/rollback-latest.sh @cryppadotta @devinfoley
+doc/RELEASING.md @cryppadotta @devinfoley
+doc/PUBLISHING.md @cryppadotta @devinfoley
+doc/RELEASE-AUTOMATION-SETUP.md @cryppadotta @devinfoley
.github/workflows/pr-verify.yml (vendored, 8 lines changed)
@@ -26,7 +26,7 @@ jobs:
       - name: Setup Node.js
         uses: actions/setup-node@v4
         with:
-          node-version: 20
+          node-version: 24
           cache: pnpm

       - name: Install dependencies

@@ -40,3 +40,9 @@ jobs:
       - name: Build
         run: pnpm build
+
+      - name: Release canary dry run
+        run: |
+          git checkout -B master HEAD
+          git checkout -- pnpm-lock.yaml
+          ./scripts/release.sh canary --skip-verify --dry-run
.github/workflows/refresh-lockfile.yml (vendored, 12 lines added)
@@ -79,3 +79,15 @@ jobs:
           else
             echo "PR #$existing already exists, branch updated via force push."
           fi
+
+      - name: Enable auto-merge for lockfile PR
+        env:
+          GH_TOKEN: ${{ github.token }}
+        run: |
+          pr_url="$(gh pr list --head chore/refresh-lockfile --json url --jq '.[0].url')"
+          if [ -z "$pr_url" ]; then
+            echo "Error: lockfile PR was not found." >&2
+            exit 1
+          fi
+
+          gh pr merge --auto --squash --delete-branch "$pr_url"
.github/workflows/release-smoke.yml (vendored, new file, 118 lines)
@@ -0,0 +1,118 @@
+name: Release Smoke
+
+on:
+  workflow_dispatch:
+    inputs:
+      paperclip_version:
+        description: Published Paperclip dist-tag to test
+        required: true
+        default: canary
+        type: choice
+        options:
+          - canary
+          - latest
+      host_port:
+        description: Host port for the Docker smoke container
+        required: false
+        default: "3232"
+        type: string
+      artifact_name:
+        description: Artifact name for uploaded diagnostics
+        required: false
+        default: release-smoke
+        type: string
+  workflow_call:
+    inputs:
+      paperclip_version:
+        required: true
+        type: string
+      host_port:
+        required: false
+        default: "3232"
+        type: string
+      artifact_name:
+        required: false
+        default: release-smoke
+        type: string
+
+jobs:
+  smoke:
+    runs-on: ubuntu-latest
+    timeout-minutes: 45
+
+    steps:
+      - name: Checkout repository
+        uses: actions/checkout@v4
+
+      - name: Setup pnpm
+        uses: pnpm/action-setup@v4
+        with:
+          version: 9.15.4
+
+      - name: Setup Node.js
+        uses: actions/setup-node@v4
+        with:
+          node-version: 24
+          cache: pnpm
+
+      - name: Install dependencies
+        run: pnpm install --no-frozen-lockfile
+
+      - name: Install Playwright browser
+        run: npx playwright install --with-deps chromium
+
+      - name: Launch Docker smoke harness
+        run: |
+          metadata_file="$RUNNER_TEMP/release-smoke.env"
+          HOST_PORT="${{ inputs.host_port }}" \
+          DATA_DIR="$RUNNER_TEMP/release-smoke-data" \
+          PAPERCLIPAI_VERSION="${{ inputs.paperclip_version }}" \
+          SMOKE_DETACH=true \
+          SMOKE_METADATA_FILE="$metadata_file" \
+          ./scripts/docker-onboard-smoke.sh
+          set -a
+          source "$metadata_file"
+          set +a
+          {
+            echo "SMOKE_BASE_URL=$SMOKE_BASE_URL"
+            echo "SMOKE_ADMIN_EMAIL=$SMOKE_ADMIN_EMAIL"
+            echo "SMOKE_ADMIN_PASSWORD=$SMOKE_ADMIN_PASSWORD"
+            echo "SMOKE_CONTAINER_NAME=$SMOKE_CONTAINER_NAME"
+            echo "SMOKE_DATA_DIR=$SMOKE_DATA_DIR"
+            echo "SMOKE_IMAGE_NAME=$SMOKE_IMAGE_NAME"
+            echo "SMOKE_PAPERCLIPAI_VERSION=$SMOKE_PAPERCLIPAI_VERSION"
+            echo "SMOKE_METADATA_FILE=$metadata_file"
+          } >> "$GITHUB_ENV"
+
+      - name: Run release smoke Playwright suite
+        env:
+          PAPERCLIP_RELEASE_SMOKE_BASE_URL: ${{ env.SMOKE_BASE_URL }}
+          PAPERCLIP_RELEASE_SMOKE_EMAIL: ${{ env.SMOKE_ADMIN_EMAIL }}
+          PAPERCLIP_RELEASE_SMOKE_PASSWORD: ${{ env.SMOKE_ADMIN_PASSWORD }}
+        run: pnpm run test:release-smoke
+
+      - name: Capture Docker logs
+        if: always()
+        run: |
+          if [[ -n "${SMOKE_CONTAINER_NAME:-}" ]]; then
+            docker logs "$SMOKE_CONTAINER_NAME" >"$RUNNER_TEMP/docker-onboard-smoke.log" 2>&1 || true
+          fi
+
+      - name: Upload diagnostics
+        if: always()
+        uses: actions/upload-artifact@v4
+        with:
+          name: ${{ inputs.artifact_name }}
+          path: |
+            ${{ runner.temp }}/docker-onboard-smoke.log
+            ${{ env.SMOKE_METADATA_FILE }}
+            tests/release-smoke/playwright-report/
+            tests/release-smoke/test-results/
+          retention-days: 14
+
+      - name: Stop Docker smoke container
+        if: always()
+        run: |
+          if [[ -n "${SMOKE_CONTAINER_NAME:-}" ]]; then
+            docker rm -f "$SMOKE_CONTAINER_NAME" >/dev/null 2>&1 || true
+          fi
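
The `SMOKE_METADATA_FILE` the harness writes is a plain `KEY=value` env file, as the `set -a` / `source` step above relies on. A minimal sketch of consuming one outside CI (the file contents here are made up):

```shell
# Simulate the metadata file the detached smoke run writes,
# then source it the same way the workflow step does.
metadata_file="$(mktemp)"
cat > "$metadata_file" <<'EOF'
SMOKE_BASE_URL=http://localhost:3232
SMOKE_CONTAINER_NAME=paperclip-smoke
EOF

set -a             # export everything sourced below
. "$metadata_file"
set +a

echo "$SMOKE_BASE_URL"   # → http://localhost:3232
```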

.github/workflows/release.yml (vendored, 205 lines changed)
@@ -1,38 +1,33 @@
 name: Release

 on:
   push:
     branches:
       - master
   workflow_dispatch:
     inputs:
-      channel:
-        description: Release channel
+      source_ref:
+        description: Commit SHA, branch, or tag to publish as stable
         required: true
-        type: choice
-        default: canary
-        options:
-          - canary
-          - stable
-      bump:
-        description: Semantic version bump
-        required: true
-        type: choice
-        default: patch
-        options:
-          - patch
-          - minor
-          - major
+        type: string
+        default: master
+      stable_date:
+        description: Enter a UTC date in YYYY-MM-DD format, for example 2026-03-18. Do not enter a version string. The workflow will resolve that date to a stable version such as 2026.318.0, then 2026.318.1 for the next same-day stable.
+        required: false
+        type: string
       dry_run:
-        description: Preview the release without publishing
+        description: Preview the stable release without publishing
         required: true
         type: boolean
-        default: true
+        default: false

 concurrency:
-  group: release-${{ github.ref }}
+  group: release-${{ github.event_name }}-${{ github.ref }}
   cancel-in-progress: false

 jobs:
-  verify:
-    if: startsWith(github.ref, 'refs/heads/release/')
+  verify_canary:
+    if: github.event_name == 'push'
     runs-on: ubuntu-latest
     timeout-minutes: 30
     permissions:

@@ -56,7 +51,7 @@ jobs:
           cache: pnpm

       - name: Install dependencies
-        run: pnpm install --frozen-lockfile
+        run: pnpm install --no-frozen-lockfile

       - name: Typecheck
         run: pnpm -r typecheck

@@ -67,12 +62,12 @@ jobs:
       - name: Build
         run: pnpm build

-  publish:
-    if: startsWith(github.ref, 'refs/heads/release/')
-    needs: verify
+  publish_canary:
+    if: github.event_name == 'push'
+    needs: verify_canary
     runs-on: ubuntu-latest
     timeout-minutes: 45
-    environment: npm-release
+    environment: npm-canary
     permissions:
       contents: write
       id-token: write

@@ -95,34 +90,168 @@ jobs:
           cache: pnpm

       - name: Install dependencies
-        run: pnpm install --frozen-lockfile
+        run: pnpm install --no-frozen-lockfile
+
+      - name: Restore tracked install-time changes
+        run: git checkout -- pnpm-lock.yaml

       - name: Configure git author
         run: |
           git config user.name "github-actions[bot]"
           git config user.email "41898282+github-actions[bot]@users.noreply.github.com"

-      - name: Run release script
+      - name: Publish canary
         env:
           GITHUB_ACTIONS: "true"
+        run: ./scripts/release.sh canary --skip-verify
+
+      - name: Push canary tag
+        run: |
+          tag="$(git tag --points-at HEAD | grep '^canary/v' | head -1)"
+          if [ -z "$tag" ]; then
+            echo "Error: no canary tag points at HEAD after release." >&2
+            exit 1
+          fi
+          git push origin "refs/tags/${tag}"
+
+  verify_stable:
+    if: github.event_name == 'workflow_dispatch'
+    runs-on: ubuntu-latest
+    timeout-minutes: 30
+    permissions:
+      contents: read
+
+    steps:
+      - name: Checkout repository
+        uses: actions/checkout@v4
+        with:
+          fetch-depth: 0
+          ref: ${{ inputs.source_ref }}
+
+      - name: Setup pnpm
+        uses: pnpm/action-setup@v4
+        with:
+          version: 9.15.4
+
+      - name: Setup Node.js
+        uses: actions/setup-node@v4
+        with:
+          node-version: 24
+          cache: pnpm
+
+      - name: Install dependencies
+        run: pnpm install --no-frozen-lockfile
+
+      - name: Typecheck
+        run: pnpm -r typecheck
+
+      - name: Run tests
+        run: pnpm test:run
+
+      - name: Build
+        run: pnpm build
+
+  preview_stable:
+    if: github.event_name == 'workflow_dispatch' && inputs.dry_run
+    needs: verify_stable
+    runs-on: ubuntu-latest
+    timeout-minutes: 45
+    permissions:
+      contents: read
+
+    steps:
+      - name: Checkout repository
+        uses: actions/checkout@v4
+        with:
+          fetch-depth: 0
+          ref: ${{ inputs.source_ref }}
+
+      - name: Setup pnpm
+        uses: pnpm/action-setup@v4
+        with:
+          version: 9.15.4
+
+      - name: Setup Node.js
+        uses: actions/setup-node@v4
+        with:
+          node-version: 24
+          cache: pnpm
+
+      - name: Install dependencies
+        run: pnpm install --no-frozen-lockfile
+
+      - name: Dry-run stable release
+        env:
+          GITHUB_ACTIONS: "true"
         run: |
-          args=("${{ inputs.bump }}")
-          if [ "${{ inputs.channel }}" = "canary" ]; then
-            args+=("--canary")
-          fi
-          if [ "${{ inputs.dry_run }}" = "true" ]; then
-            args+=("--dry-run")
+          args=(stable --skip-verify --dry-run)
+          if [ -n "${{ inputs.stable_date }}" ]; then
+            args+=(--date "${{ inputs.stable_date }}")
           fi
           ./scripts/release.sh "${args[@]}"

-      - name: Push stable release branch commit and tag
-        if: inputs.channel == 'stable' && !inputs.dry_run
-        run: git push origin "HEAD:${GITHUB_REF_NAME}" --follow-tags
+  publish_stable:
+    if: github.event_name == 'workflow_dispatch' && !inputs.dry_run
+    needs: verify_stable
+    runs-on: ubuntu-latest
+    timeout-minutes: 45
+    environment: npm-stable
+    permissions:
+      contents: write
+      id-token: write
+
+    steps:
+      - name: Checkout repository
+        uses: actions/checkout@v4
+        with:
+          fetch-depth: 0
+          ref: ${{ inputs.source_ref }}
+
+      - name: Setup pnpm
+        uses: pnpm/action-setup@v4
+        with:
+          version: 9.15.4
+
+      - name: Setup Node.js
+        uses: actions/setup-node@v4
+        with:
+          node-version: 24
+          cache: pnpm
+
+      - name: Install dependencies
+        run: pnpm install --no-frozen-lockfile
+
+      - name: Restore tracked install-time changes
+        run: git checkout -- pnpm-lock.yaml
+
+      - name: Configure git author
+        run: |
+          git config user.name "github-actions[bot]"
+          git config user.email "41898282+github-actions[bot]@users.noreply.github.com"
+
+      - name: Publish stable
+        env:
+          GITHUB_ACTIONS: "true"
+        run: |
+          args=(stable --skip-verify)
+          if [ -n "${{ inputs.stable_date }}" ]; then
+            args+=(--date "${{ inputs.stable_date }}")
+          fi
+          ./scripts/release.sh "${args[@]}"
+
+      - name: Push stable tag
+        run: |
+          tag="$(git tag --points-at HEAD | grep '^v' | head -1)"
+          if [ -z "$tag" ]; then
+            echo "Error: no stable tag points at HEAD after release." >&2
+            exit 1
+          fi
+          git push origin "refs/tags/${tag}"

       - name: Create GitHub Release
-        if: inputs.channel == 'stable' && !inputs.dry_run
         env:
           GH_TOKEN: ${{ github.token }}
           PUBLISH_REMOTE: origin
         run: |
           version="$(git tag --points-at HEAD | grep '^v' | head -1 | sed 's/^v//')"
           if [ -z "$version" ]; then
.gitignore (vendored, 2 changes)

@@ -46,5 +46,7 @@ tmp/
# Playwright
tests/e2e/test-results/
tests/e2e/playwright-report/
tests/release-smoke/test-results/
tests/release-smoke/playwright-report/
.superset/
.claude/worktrees/

@@ -120,6 +120,7 @@ Useful overrides:

```sh
HOST_PORT=3200 PAPERCLIPAI_VERSION=latest ./scripts/docker-onboard-smoke.sh
PAPERCLIP_DEPLOYMENT_MODE=authenticated PAPERCLIP_DEPLOYMENT_EXPOSURE=private ./scripts/docker-onboard-smoke.sh
SMOKE_DETACH=true SMOKE_METADATA_FILE=/tmp/paperclip-smoke.env PAPERCLIPAI_VERSION=latest ./scripts/docker-onboard-smoke.sh
```

Notes:

@@ -131,4 +132,5 @@ Notes:

- Smoke script also defaults `PAPERCLIP_PUBLIC_URL` to `http://localhost:<HOST_PORT>` so bootstrap invite URLs and auth callbacks use the reachable host port instead of the container's internal `3100`.
- In authenticated mode, the smoke script defaults `SMOKE_AUTO_BOOTSTRAP=true` and drives the real bootstrap path automatically: it signs up a real user, runs `paperclipai auth bootstrap-ceo` inside the container to mint a real bootstrap invite, accepts that invite over HTTP, and verifies board session access.
- Run the script in the foreground to watch the onboarding flow; stop with `Ctrl+C` after validation.
- Set `SMOKE_DETACH=true` to leave the container running for automation and optionally write shell-ready metadata to `SMOKE_METADATA_FILE`.
- The image definition is in `Dockerfile.onboard-smoke`.
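
The detach mode described above can be consumed from automation roughly like this. This is a sketch: the metadata file's exact variable names come from the smoke script itself, so treat anything read from it as script-defined.

```bash
# Run detached and keep the container up for follow-on automation.
SMOKE_DETACH=true SMOKE_METADATA_FILE=/tmp/paperclip-smoke.env \
  PAPERCLIPAI_VERSION=latest ./scripts/docker-onboard-smoke.sh

# The metadata file is shell-ready, so later steps can source it.
# (Variable names depend on the script; treat them as script-defined.)
. /tmp/paperclip-smoke.env
```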

doc/PUBLISHING.md

@@ -1,18 +1,19 @@
# Publishing to npm

Low-level reference for how Paperclip packages are built for npm.
Low-level reference for how Paperclip packages are prepared and published to npm.

For the maintainer release workflow, use [doc/RELEASING.md](RELEASING.md). This document is only about packaging internals and the scripts that produce publishable artifacts.
For the maintainer workflow, use [doc/RELEASING.md](RELEASING.md). This document focuses on packaging internals.

## Current Release Entry Points

Use these scripts instead of older one-off publish commands:
Use these scripts:

- [`scripts/release-start.sh`](../scripts/release-start.sh) to create or resume `release/X.Y.Z`
- [`scripts/release-preflight.sh`](../scripts/release-preflight.sh) before any canary or stable release
- [`scripts/release.sh`](../scripts/release.sh) for canary and stable npm publishes
- [`scripts/rollback-latest.sh`](../scripts/rollback-latest.sh) to repoint `latest` during rollback
- [`scripts/create-github-release.sh`](../scripts/create-github-release.sh) after pushing the stable branch tag
- [`scripts/release.sh`](../scripts/release.sh) for canary and stable publish flows
- [`scripts/create-github-release.sh`](../scripts/create-github-release.sh) after pushing a stable tag
- [`scripts/rollback-latest.sh`](../scripts/rollback-latest.sh) to repoint `latest`
- [`scripts/build-npm.sh`](../scripts/build-npm.sh) for the CLI packaging build

Paperclip no longer uses release branches or Changesets for publishing.

## Why the CLI needs special packaging

@@ -23,7 +24,7 @@ The CLI package, `paperclipai`, imports code from workspace packages such as:

- `@paperclipai/shared`
- adapter packages under `packages/adapters/`

Those workspace references use `workspace:*` during development. npm cannot install those references directly for end users, so the release build has to transform the CLI into a publishable standalone package.
Those workspace references are valid in development but not in a publishable npm package. The release flow rewrites versions temporarily, then builds a publishable CLI bundle.

## `build-npm.sh`

@@ -33,89 +34,107 @@ Run:

```bash
./scripts/build-npm.sh
```

This script does six things:
This script:

1. Runs the forbidden token check unless `--skip-checks` is supplied
2. Runs `pnpm -r typecheck`
3. Bundles the CLI entrypoint with esbuild into `cli/dist/index.js`
4. Verifies the bundled entrypoint with `node --check`
5. Rewrites `cli/package.json` into a publishable npm manifest and stores the dev copy as `cli/package.dev.json`
6. Copies the repo `README.md` into `cli/README.md` for npm package metadata
1. runs the forbidden token check unless `--skip-checks` is supplied
2. runs `pnpm -r typecheck`
3. bundles the CLI entrypoint with esbuild into `cli/dist/index.js`
4. verifies the bundled entrypoint with `node --check`
5. rewrites `cli/package.json` into a publishable npm manifest and stores the dev copy as `cli/package.dev.json`
6. copies the repo `README.md` into `cli/README.md` for npm metadata

`build-npm.sh` is used by the release script so that npm users install a real package rather than unresolved workspace dependencies.
After the release script exits, the dev manifest and temporary files are restored automatically.

## Publishable CLI layout
## Package discovery and versioning

During development, [`cli/package.json`](../cli/package.json) contains workspace references.

During release preparation:

- `cli/package.json` becomes a publishable manifest with external npm dependency ranges
- `cli/package.dev.json` stores the development manifest temporarily
- `cli/dist/index.js` contains the bundled CLI entrypoint
- `cli/README.md` is copied in for npm metadata

After release finalization, the release script restores the development manifest and removes the temporary README copy.

## Package discovery

The release tooling scans the workspace for public packages under:
Public packages are discovered from:

- `packages/`
- `server/`
- `cli/`

`ui/` remains ignored for npm publishing because it is private.
`ui/` is ignored because it is private.

This matters because all public packages are versioned and published together as one release unit.
The version rewrite step now uses [`scripts/release-package-map.mjs`](../scripts/release-package-map.mjs), which:

## Canary packaging model
- finds all public packages
- sorts them topologically by internal dependencies
- rewrites each package version to the target release version
- rewrites internal `workspace:*` dependency references to the exact target version
- updates the CLI's displayed version string

Canaries are published as semver prereleases such as:
Those rewrites are temporary. The working tree is restored after publish or dry-run.

- `1.2.3-canary.0`
- `1.2.3-canary.1`
## Version formats

They are published under the npm dist-tag `canary`.
Paperclip uses calendar versions:

This means:
- stable: `YYYY.MDD.P`
- canary: `YYYY.MDD.P-canary.N`

- `npx paperclipai@canary onboard` can install them explicitly
- `npx paperclipai onboard` continues to resolve `latest`
- the stable changelog can stay at `releases/v1.2.3.md`
Examples:

## Stable packaging model
- stable: `2026.318.0`
- canary: `2026.318.1-canary.2`
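
One practical consequence of this format: the stable changelog path can be derived from any published version by stripping the prerelease suffix, since canaries never get their own changelog files. A minimal sketch:

```bash
v="2026.318.1-canary.2"        # any published version, stable or canary
stable="${v%%-*}"              # strip the -canary.N suffix if present
echo "releases/v${stable}.md"  # releases/v2026.318.1.md
```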

Stable releases publish normal semver versions such as `1.2.3` under the npm dist-tag `latest`.
## Publish model

The stable publish flow also creates the local release commit and git tag on `release/X.Y.Z`. Pushing that branch commit/tag, creating the GitHub Release, and merging the release branch back to `master` happen afterward as separate maintainer steps.
### Canary

Canaries publish under the npm dist-tag `canary`.

Example:

- `paperclipai@2026.318.1-canary.2`

This keeps the default install path unchanged while allowing explicit installs with:

```bash
npx paperclipai@canary onboard
```

### Stable

Stable publishes use the npm dist-tag `latest`.

Example:

- `paperclipai@2026.318.0`

Stable publishes do not create a release commit. Instead:

- package versions are rewritten temporarily
- packages are published from the chosen source commit
- git tag `vYYYY.MDD.P` points at that original commit
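
Because the tag points at the original source commit, a promotion can be sanity-checked after the fact. A sketch, where `2026.318.0` and `$SOURCE_SHA` are placeholders for the actual release:

```bash
# Both commands should print the same commit SHA.
git rev-parse "v2026.318.0^{commit}"
git rev-parse "$SOURCE_SHA"
```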

## Trusted publishing

The intended CI model is npm trusted publishing through GitHub OIDC.

That means:

- no long-lived `NPM_TOKEN` in repository secrets
- GitHub Actions obtains short-lived publish credentials
- trusted publisher rules are configured per workflow file

See [doc/RELEASE-AUTOMATION-SETUP.md](RELEASE-AUTOMATION-SETUP.md) for the GitHub/npm setup steps.

## Rollback model

Rollback does not unpublish packages.
Rollback does not unpublish anything.

Instead, the maintainer should move the `latest` dist-tag back to the previous good stable version with:
It repoints the `latest` dist-tag to a prior stable version:

```bash
./scripts/rollback-latest.sh <stable-version>
./scripts/rollback-latest.sh 2026.318.0
```

That keeps history intact while restoring the default install path quickly.
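
Under the hood this is an npm dist-tag move, conceptually equivalent to the following sketch (the script wraps it in safety checks):

```bash
# Repoint `latest` at a known-good stable version, then confirm.
npm dist-tag add paperclipai@2026.318.0 latest
npm dist-tag ls paperclipai
```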

## Notes for CI

The repo includes a manual GitHub Actions release workflow at [`.github/workflows/release.yml`](../.github/workflows/release.yml).

Recommended CI release setup:

- use npm trusted publishing via GitHub OIDC
- require approval through the `npm-release` environment
- run releases from `release/X.Y.Z`
- use canary first, then stable

This is the fastest way to restore the default install path if a stable release is bad.

## Related Files

- [`scripts/build-npm.sh`](../scripts/build-npm.sh)
- [`scripts/generate-npm-package-json.mjs`](../scripts/generate-npm-package-json.mjs)
- [`scripts/release-package-map.mjs`](../scripts/release-package-map.mjs)
- [`cli/esbuild.config.mjs`](../cli/esbuild.config.mjs)
- [`doc/RELEASING.md`](RELEASING.md)

doc/RELEASE-AUTOMATION-SETUP.md (new file, 281 lines)

@@ -0,0 +1,281 @@
# Release Automation Setup

This document covers the GitHub and npm setup required for the current Paperclip release model:

- automatic canaries from `master`
- manual stable promotion from a chosen source ref
- npm trusted publishing via GitHub OIDC
- protected release infrastructure in a public repository

Repo-side files that depend on this setup:

- `.github/workflows/release.yml`
- `.github/CODEOWNERS`

Note:

- the release workflows intentionally use `pnpm install --no-frozen-lockfile`
- this matches the repo's current policy where `pnpm-lock.yaml` is refreshed by GitHub automation after manifest changes land on `master`
- the publish jobs then restore `pnpm-lock.yaml` before running `scripts/release.sh`, so the release script still sees a clean worktree

## 1. Merge the Repo Changes First

Before touching GitHub or npm settings, merge the release automation code so the referenced workflow filenames already exist on the default branch.

Required files:

- `.github/workflows/release.yml`
- `.github/CODEOWNERS`

## 2. Configure npm Trusted Publishing

Do this for every public package that Paperclip publishes.

At minimum that includes:

- `paperclipai`
- `@paperclipai/server`
- public packages under `packages/`

### 2.1. In npm, open each package settings page

For each package:

1. open npm as an owner of the package
2. go to the package settings / publishing access area
3. add a trusted publisher for the GitHub repository `paperclipai/paperclip`

### 2.2. Add one trusted publisher entry per package

npm currently allows one trusted publisher configuration per package.

Configure:

- workflow: `.github/workflows/release.yml`

Repository:

- `paperclipai/paperclip`

Environment name:

- leave the npm trusted-publisher environment field blank

Why:

- the single `release.yml` workflow handles both canary and stable publishing
- GitHub environments `npm-canary` and `npm-stable` still enforce different approval rules on the GitHub side

### 2.3. Verify trusted publishing before removing old auth

After the workflows are live:

1. run a canary publish
2. confirm npm publish succeeds without any `NPM_TOKEN`
3. run a stable dry-run
4. run one real stable publish

Only after that should you remove old token-based access.

## 3. Remove Legacy npm Tokens

After trusted publishing works:

1. revoke any repository or organization `NPM_TOKEN` secrets used for publish
2. revoke any personal automation token that used to publish Paperclip
3. if npm offers a package-level setting to restrict publishing to trusted publishers, enable it

Goal:

- no long-lived npm publishing token should remain in GitHub Actions

## 4. Create GitHub Environments

Create two environments in the GitHub repository:

- `npm-canary`
- `npm-stable`

Path:

1. GitHub repository
2. `Settings`
3. `Environments`
4. `New environment`

## 5. Configure `npm-canary`

Recommended settings for `npm-canary`:

- environment name: `npm-canary`
- required reviewers: none
- wait timer: none
- deployment branches and tags:
  - selected branches only
  - allow `master`

Reasoning:

- every push to `master` should be able to publish a canary automatically
- no human approval should be required for canaries

## 6. Configure `npm-stable`

Recommended settings for `npm-stable`:

- environment name: `npm-stable`
- required reviewers: at least one maintainer other than the person triggering the workflow when possible
- prevent self-review: enabled
- admin bypass: disabled if your team can tolerate it
- wait timer: optional
- deployment branches and tags:
  - selected branches only
  - allow `master`

Reasoning:

- stable publishes should require an explicit human approval gate
- the workflow is manual, but the environment should still be the real control point

## 7. Protect `master`

Open the branch protection settings for `master`.

Recommended rules:

1. require pull requests before merging
2. require status checks to pass before merging
3. require review from code owners
4. dismiss stale approvals when new commits are pushed
5. restrict who can push directly to `master`

At minimum, make sure workflow and release script changes cannot land without review.

## 8. Enforce CODEOWNERS Review

This repo now includes `.github/CODEOWNERS`, but GitHub only enforces it if branch protection requires code owner reviews.

In branch protection for `master`, enable:

- `Require review from Code Owners`

Then verify the owner entries are correct for your actual maintainer set.

Current file:

- `.github/CODEOWNERS`

If `@cryppadotta` is not the right reviewer identity in the public repo, change it before enabling enforcement.

## 9. Protect Release Infrastructure Specifically

These files should always trigger code owner review:

- `.github/workflows/release.yml`
- `scripts/release.sh`
- `scripts/release-lib.sh`
- `scripts/release-package-map.mjs`
- `scripts/create-github-release.sh`
- `scripts/rollback-latest.sh`
- `doc/RELEASING.md`
- `doc/PUBLISHING.md`

If you want stronger controls, add a repository ruleset that explicitly blocks direct pushes to:

- `.github/workflows/**`
- `scripts/release*`

## 10. Do Not Store a Claude Token in GitHub Actions

Do not add a personal Claude or Anthropic token for automatic changelog generation.

Recommended policy:

- stable changelog generation happens locally from a trusted maintainer machine
- canaries never generate changelogs

This keeps LLM spending intentional and avoids a high-value token sitting in Actions.

## 11. Verify the Canary Workflow

After setup:

1. merge a harmless commit to `master`
2. open the `Release` workflow run triggered by that push
3. confirm it passes verification
4. confirm publish succeeds under the `npm-canary` environment
5. confirm npm now shows a new `canary` release
6. confirm a git tag named `canary/vYYYY.MDD.P-canary.N` was pushed
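
Step 6 can be checked from a local clone. A sketch, assuming the canary tag naming shown above:

```bash
# List the most recent canary tags fetched from the remote.
git fetch origin --tags
git tag --list 'canary/v*' --sort=-creatordate | head -5
```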

Install-path check:

```bash
npx paperclipai@canary onboard
```

## 12. Verify the Stable Workflow

After at least one good canary exists:

1. resolve the target stable version with `./scripts/release.sh stable --date YYYY-MM-DD --print-version`
2. prepare `releases/vYYYY.MDD.P.md` on the source commit you want to promote
3. open `Actions` -> `Release`
4. run it with:
   - `source_ref`: the tested commit SHA or canary tag source commit
   - `stable_date`: leave blank or set the intended UTC date like `2026-03-18`;
     do not enter a version like `2026.318.0`; the workflow computes that from the date
   - `dry_run`: `true`
5. confirm the dry-run succeeds
6. rerun with `dry_run: false`
7. approve the `npm-stable` environment when prompted
8. confirm npm `latest` points to the new stable version
9. confirm git tag `vYYYY.MDD.P` exists
10. confirm the GitHub Release was created
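
Steps 8 and 9 can be verified from a shell. A sketch, where `2026.318.0` is a placeholder for the actual version:

```bash
# `latest` should point at the new stable version.
npm dist-tag ls paperclipai

# The stable tag should exist on the remote.
git ls-remote --tags origin 'refs/tags/v2026.318.0'
```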

Implementation note:

- the GitHub Actions stable workflow calls `create-github-release.sh` with `PUBLISH_REMOTE=origin`
- local maintainer usage can still pass `PUBLISH_REMOTE=public-gh` explicitly when needed

## 13. Suggested Maintainer Policy

Use this policy going forward:

- canaries are automatic and cheap
- stables are manual and approved
- only stables get public notes and announcements
- release notes are committed before stable publish
- rollback uses `npm dist-tag`, not unpublish

## 14. Troubleshooting

### Trusted publishing fails with an auth error

Check:

1. the workflow filename on GitHub exactly matches the filename configured in npm
2. the package has the trusted publisher entry for the correct repository
3. the job has `id-token: write`
4. the job is running from the expected repository, not a fork

### Stable workflow runs but never asks for approval

Check:

1. the `publish` job uses environment `npm-stable`
2. the environment actually has required reviewers configured
3. the workflow is running in the canonical repository, not a fork

### CODEOWNERS does not trigger

Check:

1. `.github/CODEOWNERS` is on the default branch
2. branch protection on `master` requires code owner review
3. the owner identities in the file are valid reviewers with repository access

## Related Docs

- [doc/RELEASING.md](RELEASING.md)
- [doc/PUBLISHING.md](PUBLISHING.md)
- [doc/plans/2026-03-17-release-automation-and-versioning.md](plans/2026-03-17-release-automation-and-versioning.md)

doc/RELEASING.md (477 changes)

@@ -1,220 +1,174 @@
# Releasing Paperclip

Maintainer runbook for shipping a full Paperclip release across npm, GitHub, and the website-facing changelog surface.
Maintainer runbook for shipping Paperclip across npm, GitHub, and the website-facing changelog surface.

The release model is branch-driven:
The release model is now commit-driven:

1. Start a release train on `release/X.Y.Z`
2. Draft the stable changelog on that branch
3. Publish one or more canaries from that branch
4. Publish stable from that same branch head
5. Push the branch commit and tag
6. Create the GitHub Release
7. Merge `release/X.Y.Z` back to `master` without squash or rebase
1. Every push to `master` publishes a canary automatically.
2. Stable releases are manually promoted from a chosen tested commit or canary tag.
3. Stable release notes live in `releases/vYYYY.MDD.P.md`.
4. Only stable releases get GitHub Releases.

## Versioning Model

Paperclip uses calendar versions that still fit semver syntax:

- stable: `YYYY.MDD.P`
- canary: `YYYY.MDD.P-canary.N`

Examples:

- first stable on March 18, 2026: `2026.318.0`
- second stable on March 18, 2026: `2026.318.1`
- fourth canary for the `2026.318.1` line: `2026.318.1-canary.3`

Important constraints:

- the middle numeric slot is `MDD`, where `M` is the UTC month and `DD` is the zero-padded UTC day
- use `2026.303.0` for March 3, not `2026.33.0`
- do not use leading zeroes such as `2026.0318.0`
- do not use four numeric segments such as `2026.3.18.1`
- the semver-safe canary form is `2026.318.0-canary.1`
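
The version derivation above can be sketched in shell. This assumes the intended date is already known and the same-day patch slot is `0`; the real release scripts discover the next free slot from existing tags:

```bash
d="2026-03-18"                     # intended release date (UTC)
year=${d%%-*}
month=${d#*-}; month=${month%%-*}
month=${month#0}                   # drop the leading zero: 03 -> 3
day=${d##*-}                       # day stays zero-padded: 18
patch=0                            # next free same-day stable slot
version="${year}.${month}${day}.${patch}"
echo "$version"                    # 2026.318.0
```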

## Release Surfaces

Every release has four separate surfaces:
Every stable release has four separate surfaces:

1. **Verification** — the exact git SHA passes typecheck, tests, and build
2. **npm** — `paperclipai` and public workspace packages are published
3. **GitHub** — the stable release gets a git tag and GitHub Release
4. **Website / announcements** — the stable changelog is published externally and announced

A release is done only when all four surfaces are handled.
A stable release is done only when all four surfaces are handled.

Canaries only cover the first two surfaces plus an internal traceability tag.

## Core Invariants

- Canary and stable for `X.Y.Z` must come from the same `release/X.Y.Z` branch.
- The release scripts must run from the matching `release/X.Y.Z` branch.
- Once `vX.Y.Z` exists locally, on GitHub, or on npm, that release train is frozen.
- Do not squash-merge or rebase-merge a release branch PR back to `master`.
- The stable changelog is always `releases/vX.Y.Z.md`. Never create canary changelog files.

The reason for the merge rule is simple: the tag must keep pointing at the exact published commit. Squash or rebase breaks that property.
- canaries publish from `master`
- stables publish from an explicitly chosen source ref
- tags point at the original source commit, not a generated release commit
- stable notes are always `releases/vYYYY.MDD.P.md`
- canaries never create GitHub Releases
- canaries never require changelog generation

## TL;DR

### 1. Start the release train
### Canary

Use this to compute the next version, create or resume the branch, create or resume a dedicated worktree, and push the branch to GitHub.
Every push to `master` runs the canary path inside [`.github/workflows/release.yml`](../.github/workflows/release.yml).

```bash
./scripts/release-start.sh patch
```
It:

That script:

- fetches the release remote and tags
- computes the next stable version from the latest `v*` tag
- creates or resumes `release/X.Y.Z`
- creates or resumes a dedicated worktree
- pushes the branch to the remote by default
- refuses to reuse a frozen release train

### 2. Draft the stable changelog

From the release worktree:

```bash
VERSION=X.Y.Z
claude --print --output-format stream-json --verbose --dangerously-skip-permissions --model claude-opus-4-6 "Use the release-changelog skill to draft or update releases/v${VERSION}.md for Paperclip. Read doc/RELEASING.md and .agents/skills/release-changelog/SKILL.md, then generate the stable changelog for v${VERSION} from commits since the last stable tag. Do not create a canary changelog."
```

### 3. Verify and publish a canary

```bash
./scripts/release-preflight.sh canary patch
./scripts/release.sh patch --canary --dry-run
./scripts/release.sh patch --canary
PAPERCLIPAI_VERSION=canary ./scripts/docker-onboard-smoke.sh
```
- verifies the pushed commit
- computes the canary version for the current UTC date
- publishes under npm dist-tag `canary`
- creates a git tag `canary/vYYYY.MDD.P-canary.N`

Users install canaries with:

```bash
npx paperclipai@canary onboard
```

### 4. Publish stable

```bash
./scripts/release-preflight.sh stable patch
./scripts/release.sh patch --dry-run
./scripts/release.sh patch
git push public-gh HEAD --follow-tags
./scripts/create-github-release.sh X.Y.Z
```

Then open a PR from `release/X.Y.Z` to `master` and merge without squash or rebase.

## Release Branches

Paperclip uses one release branch per target stable version:

- `release/0.3.0`
- `release/0.3.1`
- `release/1.0.0`

Do not create separate per-canary branches like `canary/0.3.0-1`. A canary is just a prerelease snapshot of the same stable train.

## Script Entry Points

- [`scripts/release-start.sh`](../scripts/release-start.sh) — create or resume the release train branch/worktree
- [`scripts/release-preflight.sh`](../scripts/release-preflight.sh) — validate branch, version plan, git/npm state, and verification gate
- [`scripts/release.sh`](../scripts/release.sh) — publish canary or stable from the release branch
- [`scripts/create-github-release.sh`](../scripts/create-github-release.sh) — create or update the GitHub Release after pushing the tag
- [`scripts/rollback-latest.sh`](../scripts/rollback-latest.sh) — repoint `latest` to the last good stable version

## Detailed Workflow

### 1. Start or resume the release train

Run:

```bash
./scripts/release-start.sh <patch|minor|major>
```

Useful options:

```bash
./scripts/release-start.sh patch --dry-run
./scripts/release-start.sh minor --worktree-dir ../paperclip-release-0.4.0
./scripts/release-start.sh patch --no-push
```

The script is intentionally idempotent:

- if `release/X.Y.Z` already exists locally, it reuses it
- if the branch already exists on the remote, it resumes it locally
- if the branch is already checked out in another worktree, it points you there
- if `vX.Y.Z` already exists locally, remotely, or on npm, it refuses to reuse that train

### 2. Write the stable changelog early

Create or update:

- `releases/vX.Y.Z.md`

That file is for the eventual stable release. It should not include `-canary` in the filename or heading.

Recommended structure:

- `Breaking Changes` when needed
- `Highlights`
- `Improvements`
- `Fixes`
- `Upgrade Guide` when needed
- `Contributors` — @-mention every contributor by GitHub username (no emails)

Package-level `CHANGELOG.md` files are generated as part of the release mechanics. They are not the main release narrative.

### 3. Run release preflight

From the `release/X.Y.Z` worktree:

```bash
./scripts/release-preflight.sh canary <patch|minor|major>
# or
./scripts/release-preflight.sh stable <patch|minor|major>
npx paperclipai@canary onboard --data-dir "$(mktemp -d /tmp/paperclip-canary.XXXXXX)"
```

The preflight script now checks all of the following before it runs the verification gate:
### Stable

- the worktree is clean, including untracked files
- the current branch matches the computed `release/X.Y.Z`
- the release train is not frozen
- the target version is still free on npm
- the target tag does not already exist locally or remotely
- whether the remote release branch already exists
- whether `releases/vX.Y.Z.md` is present
Use [`.github/workflows/release.yml`](../.github/workflows/release.yml) from the Actions tab with the manual `workflow_dispatch` inputs.

Then it runs:
[Run the action here](https://github.com/paperclipai/paperclip/actions/workflows/release.yml)

Inputs:

- `source_ref`
  - commit SHA, branch, or tag
- `stable_date`
  - optional UTC date override in `YYYY-MM-DD`
  - enter a date like `2026-03-18`, not a version like `2026.318.0`
- `dry_run`
  - preview only when true

Before running stable:

1. pick the canary commit or tag you trust
2. resolve the target stable version with `./scripts/release.sh stable --date "$(date +%F)" --print-version`
3. create or update `releases/vYYYY.MDD.P.md` on that source ref
4. run the stable workflow from that source ref

Example:

- `source_ref`: `master`
- `stable_date`: `2026-03-18`
- resulting stable version: `2026.318.0`

The workflow:

- re-verifies the exact source ref
- computes the next stable patch slot for the chosen UTC date
- publishes `YYYY.MDD.P` under npm dist-tag `latest`
- creates git tag `vYYYY.MDD.P`
- creates or updates the GitHub Release from `releases/vYYYY.MDD.P.md`

## Local Commands
|
||||
|
||||
### Preview a canary locally
|
||||
|
||||
```bash
|
||||
pnpm -r typecheck
|
||||
pnpm test:run
|
||||
pnpm build
|
||||
./scripts/release.sh canary --dry-run
|
||||
```
|
||||
|
||||
### Preview a stable locally

```bash
./scripts/release.sh stable --dry-run
```
### Publish a stable locally

This is mainly for emergency/manual use. The normal path is the GitHub workflow.

```bash
./scripts/release.sh stable
git push public-gh refs/tags/vYYYY.MDD.P
PUBLISH_REMOTE=public-gh ./scripts/create-github-release.sh YYYY.MDD.P
```

Guardrails:

- the script refuses to run from the wrong branch
- the canary is always derived from the next stable version
- if the stable notes file is missing, the script warns before you forget it
## Stable Changelog Workflow

Stable changelog files live at:

- `releases/vYYYY.MDD.P.md`
Canaries do not get changelog files.

Recommended local generation flow:
```bash
VERSION="$(./scripts/release.sh stable --date 2026-03-18 --print-version)"
claude --print --output-format stream-json --verbose --dangerously-skip-permissions --model claude-opus-4-6 "Use the release-changelog skill to draft or update releases/v${VERSION}.md for Paperclip. Read doc/RELEASING.md and .agents/skills/release-changelog/SKILL.md, then generate the stable changelog for v${VERSION} from commits since the last stable tag. Do not create a canary changelog."
```
The repo intentionally does not run this through GitHub Actions because:

- canaries are too frequent
- stable notes are the only public narrative surface that needs LLM help
- maintainer LLM tokens should not live in Actions
## Smoke Testing

For a canary:

```bash
PAPERCLIPAI_VERSION=canary ./scripts/docker-onboard-smoke.sh
```

For the current stable:

```bash
PAPERCLIPAI_VERSION=latest ./scripts/docker-onboard-smoke.sh
```

Useful isolated variants:

```bash
HOST_PORT=3232 DATA_DIR=./data/release-smoke-canary PAPERCLIPAI_VERSION=canary ./scripts/docker-onboard-smoke.sh
HOST_PORT=3233 DATA_DIR=./data/release-smoke-stable PAPERCLIPAI_VERSION=latest ./scripts/docker-onboard-smoke.sh
```
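If you script around these smoke runs, a small polling helper keeps the health wait explicit. This is a sketch — the smoke script has its own readiness logic; `/api/health` is the endpoint the diagnostics section already snapshots:

```bash
# Retry a command up to N times, one second apart; succeed on first success.
wait_for() {
  local tries="$1"; shift
  local i
  for ((i = 1; i <= tries; i++)); do
    "$@" && return 0
    sleep 1
  done
  return 1
}

# Example: wait up to 60s for the published container to answer health checks.
# wait_for 60 curl -fsS "http://localhost:${HOST_PORT:-3232}/api/health"
```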
Automated browser smoke is also available:

```bash
gh workflow run release-smoke.yml -f paperclip_version=canary
gh workflow run release-smoke.yml -f paperclip_version=latest
```
Minimum checks:

- authenticated login works with the smoke credentials
- the browser lands in onboarding on a fresh instance
- company creation succeeds
- the first CEO agent is created
- the first CEO heartbeat run is triggered
## Rollback

Rollback does not unpublish versions.

It only moves the `latest` dist-tag back to a previous stable:

```bash
./scripts/rollback-latest.sh 2026.318.0 --dry-run
./scripts/rollback-latest.sh 2026.318.0
```
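Under the hood this is a dist-tag move, not an unpublish. Conceptually (illustrative only — use the script, which wraps the move in safety checks; the version shown is a hypothetical previous stable):

```bash
npm dist-tag add paperclipai@2026.317.0 latest   # point latest at the last good stable
npm dist-tag ls paperclipai                      # verify the canary/latest pointers
```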
Then fix forward with a new stable patch slot or release date.
## Failure Playbooks

### If the canary publishes but smoke testing fails

Do not run stable.

Instead:

1. fix the issue on `master`
2. merge the fix
3. wait for the next automatic canary
4. rerun smoke testing
### If stable npm publish succeeds but tag push or GitHub release creation fails

This is a partial release. npm is already live.

Do this immediately:

1. push the missing tag
2. rerun `PUBLISH_REMOTE=public-gh ./scripts/create-github-release.sh YYYY.MDD.P`
3. verify the GitHub Release notes point at `releases/vYYYY.MDD.P.md`

Do not republish the same version.
### If `latest` is broken after stable publish

Roll back the dist-tag:

```bash
./scripts/rollback-latest.sh YYYY.MDD.P --dry-run
./scripts/rollback-latest.sh YYYY.MDD.P
```

Then fix forward with a new stable release.
### If the GitHub Release notes are wrong

Re-run:

```bash
PUBLISH_REMOTE=public-gh ./scripts/create-github-release.sh YYYY.MDD.P
```

If the release already exists, the script updates it.
## Related Files

- [`doc/PUBLISHING.md`](PUBLISHING.md) — low-level npm build and packaging internals
- [`doc/RELEASE-AUTOMATION-SETUP.md`](RELEASE-AUTOMATION-SETUP.md)
- [`.agents/skills/release/SKILL.md`](../.agents/skills/release/SKILL.md) — maintainer release coordination workflow
- [`.agents/skills/release-changelog/SKILL.md`](../.agents/skills/release-changelog/SKILL.md) — stable changelog drafting workflow
- [`scripts/release.sh`](../scripts/release.sh)
- [`scripts/release-package-map.mjs`](../scripts/release-package-map.mjs)
- [`scripts/create-github-release.sh`](../scripts/create-github-release.sh)
- [`scripts/rollback-latest.sh`](../scripts/rollback-latest.sh)
---

New file: `doc/memory-landscape.md`
# Memory Landscape

Date: 2026-03-17

This document summarizes the memory systems referenced in task `PAP-530` and extracts the design patterns that matter for Paperclip.

## What Paperclip Needs From This Survey

Paperclip is not trying to become a single opinionated memory engine. The more useful target is a control-plane memory surface that:

- stays company-scoped
- lets each company choose a default memory provider
- lets specific agents override that default
- keeps provenance back to Paperclip runs, issues, comments, and documents
- records memory-related cost and latency the same way the rest of the control plane records work
- works with plugin-provided providers, not only built-ins

The question is not "which memory project wins?" The question is "what is the smallest Paperclip contract that can sit above several very different memory systems without flattening away the useful differences?"

## Quick Grouping

### Hosted memory APIs

- `mem0`
- `supermemory`
- `Memori`

These optimize for a simple application integration story: send conversation/content plus an identity, then query for relevant memory or user context later.

### Agent-centric memory frameworks / memory OSes

- `MemOS`
- `memU`
- `EverMemOS`
- `OpenViking`

These treat memory as an agent runtime subsystem, not only as a search index. They usually add task memory, profiles, filesystem-style organization, async ingestion, or skill/resource management.

### Local-first memory stores / indexes

- `nuggets`
- `memsearch`

These emphasize local persistence, inspectability, and low operational overhead. They are useful because Paperclip is local-first today and needs at least one zero-config path.
## Per-Project Notes

| Project | Shape | Notable API / model | Strong fit for Paperclip | Main mismatch |
|---|---|---|---|---|
| [nuggets](https://github.com/NeoVertex1/nuggets) | local memory engine + messaging gateway | topic-scoped HRR memory with `remember`, `recall`, `forget`, fact promotion into `MEMORY.md` | good example of lightweight local memory and automatic promotion | very specific architecture; not a general multi-tenant service |
| [mem0](https://github.com/mem0ai/mem0) | hosted + OSS SDK | `add`, `search`, `getAll`, `get`, `update`, `delete`, `deleteAll`; entity partitioning via `user_id`, `agent_id`, `run_id`, `app_id` | closest to a clean provider API with identities and metadata filters | provider owns extraction heavily; Paperclip should not assume every backend behaves like mem0 |
| [MemOS](https://github.com/MemTensor/MemOS) | memory OS / framework | unified add-retrieve-edit-delete, memory cubes, multimodal memory, tool memory, async scheduler, feedback/correction | strong source for optional capabilities beyond plain search | much broader than the minimal contract Paperclip should standardize first |
| [supermemory](https://github.com/supermemoryai/supermemory) | hosted memory + context API | `add`, `profile`, `search.memories`, `search.documents`, document upload, settings; automatic profile building and forgetting | strong example of "context bundle" rather than raw search results | heavily productized around its own ontology and hosted flow |
| [memU](https://github.com/NevaMind-AI/memU) | proactive agent memory framework | file-system metaphor, proactive loop, intent prediction, always-on companion model | good source for when memory should trigger agent behavior, not just retrieval | proactive assistant framing is broader than Paperclip's task-centric control plane |
| [Memori](https://github.com/MemoriLabs/Memori) | hosted memory fabric + SDK wrappers | registers against LLM SDKs, attribution via `entity_id` + `process_id`, sessions, cloud + BYODB | strong example of automatic capture around model clients | wrapper-centric design does not map 1:1 to Paperclip's run / issue / comment lifecycle |
| [EverMemOS](https://github.com/EverMind-AI/EverMemOS) | conversational long-term memory system | MemCell extraction, structured narratives, user profiles, hybrid retrieval / reranking | useful model for provenance-rich structured memories and evolving profiles | focused on conversational memory rather than generalized control-plane events |
| [memsearch](https://github.com/zilliztech/memsearch) | markdown-first local memory index | markdown as source of truth, `index`, `search`, `watch`, transcript parsing, plugin hooks | excellent baseline for a local built-in provider and inspectable provenance | intentionally simple; no hosted service semantics or rich correction workflow |
| [OpenViking](https://github.com/volcengine/OpenViking) | context database | filesystem-style organization of memories/resources/skills, tiered loading, visualized retrieval trajectories | strong source for browse/inspect UX and context provenance | treats "context database" as a larger product surface than Paperclip should own |
## Common Primitives Across The Landscape

Even though the systems disagree on architecture, they converge on a few primitives:

- `ingest`: add memory from text, messages, documents, or transcripts
- `query`: search or retrieve memory given a task, question, or scope
- `scope`: partition memory by user, agent, project, process, or session
- `provenance`: carry enough metadata to explain where a memory came from
- `maintenance`: update, forget, dedupe, compact, or correct memories over time
- `context assembly`: turn raw memories into a prompt-ready bundle for the agent

If Paperclip does not expose these, it will not adapt well to the systems above.

## Where The Systems Differ

These differences are exactly why Paperclip needs a layered contract instead of a single hard-coded engine.

### 1. Who owns extraction?

- `mem0`, `supermemory`, and `Memori` expect the provider to infer memories from conversations.
- `memsearch` expects the host to decide what markdown to write, then indexes it.
- `MemOS`, `memU`, `EverMemOS`, and `OpenViking` sit somewhere in between and often expose richer memory construction pipelines.

Paperclip should support both:

- provider-managed extraction
- Paperclip-managed extraction with provider-managed storage / retrieval

### 2. What is the source of truth?

- `memsearch` and `nuggets` make the source inspectable on disk.
- hosted APIs often make the provider store canonical.
- filesystem-style systems like `OpenViking` and `memU` treat hierarchy itself as part of the memory model.

Paperclip should not require a single storage shape. It should require normalized references back to Paperclip entities.

### 3. Is memory just search, or also profile and planning state?

- `mem0` and `memsearch` center search and CRUD.
- `supermemory` adds user profiles as a first-class output.
- `MemOS`, `memU`, `EverMemOS`, and `OpenViking` expand into tool traces, task memory, resources, and skills.

Paperclip should make plain search the minimum contract and richer outputs optional capabilities.

### 4. Is memory synchronous or asynchronous?

- local tools often work synchronously in-process.
- larger systems add schedulers, background indexing, compaction, or sync jobs.

Paperclip needs both direct request/response operations and background maintenance hooks.

## Paperclip-Specific Takeaways

### Paperclip should own these concerns

- binding a provider to a company and optionally overriding it per agent
- mapping Paperclip entities into provider scopes
- provenance back to issue comments, documents, runs, and activity
- cost / token / latency reporting for memory work
- browse and inspect surfaces in the Paperclip UI
- governance on destructive operations

### Providers should own these concerns

- extraction heuristics
- embedding / indexing strategy
- ranking and reranking
- profile synthesis
- contradiction resolution and forgetting logic
- storage engine details

### The control-plane contract should stay small

Paperclip does not need to standardize every feature from every provider. It needs:

- a required portable core
- optional capability flags for richer providers
- a way to record provider-native ids and metadata without pretending all providers are equivalent internally
## Recommended Direction

Paperclip should adopt a two-layer memory model:

1. `Memory binding + control plane layer`
   Paperclip decides which provider key is in effect for a company, agent, or project, and it logs every memory operation with provenance and usage.

2. `Provider adapter layer`
   A built-in or plugin-supplied adapter turns Paperclip memory requests into provider-specific calls.
The portable core should cover:

- ingest / write
- search / recall
- browse / inspect
- get by provider record handle
- forget / correction
- usage reporting

Optional capabilities can cover:

- profile synthesis
- async ingestion
- multimodal content
- tool / resource / skill memory
- provider-native graph browsing

That is enough to support:

- a local markdown-first baseline similar to `memsearch`
- hosted services similar to `mem0`, `supermemory`, or `Memori`
- richer agent-memory systems like `MemOS` or `OpenViking`

without forcing Paperclip itself to become a monolithic memory engine.
---

New file: `doc/plans/2026-03-17-docker-release-browser-e2e.md`
# Docker Release Browser E2E Plan

## Context

Today, release smoke testing for published Paperclip packages is manual and shell-driven:

```sh
HOST_PORT=3232 DATA_DIR=./data/release-smoke-canary PAPERCLIPAI_VERSION=canary ./scripts/docker-onboard-smoke.sh
HOST_PORT=3233 DATA_DIR=./data/release-smoke-stable PAPERCLIPAI_VERSION=latest ./scripts/docker-onboard-smoke.sh
```

That is useful because it exercises the same public install surface users hit:

- Docker
- `npx paperclipai@canary`
- `npx paperclipai@latest`
- authenticated bootstrap flow

But it still leaves the most important release questions to a human with a browser:

- can I sign in with the smoke credentials?
- do I land in onboarding?
- can I complete onboarding?
- does the initial CEO agent actually get created and run?

The repo already has two adjacent pieces:

- `tests/e2e/onboarding.spec.ts` covers the onboarding wizard against the local source tree
- `scripts/docker-onboard-smoke.sh` boots a published Docker install and auto-bootstraps authenticated mode, but only verifies the API/session layer

What is missing is one deterministic browser test that joins those two paths.

## Goal

Add a release-grade Docker-backed browser E2E that validates the published `canary` and `latest` installs end to end:

1. boot the published package in Docker
2. sign in with known smoke credentials
3. verify the user is routed into onboarding
4. complete onboarding in the browser
5. verify the first CEO agent exists
6. verify the initial CEO run was triggered and reached a terminal or active state

Then wire that test into GitHub Actions so release validation is no longer manual-only.

## Recommendation In One Sentence

Turn the current Docker smoke script into a machine-friendly test harness, add a dedicated Playwright release-smoke spec that drives the authenticated browser flow against published Docker installs, and run it in GitHub Actions for both `canary` and `latest`.

## What We Have Today

### Existing local browser coverage

`tests/e2e/onboarding.spec.ts` already proves the onboarding wizard can:

- create a company
- create a CEO agent
- create an initial issue
- optionally observe task progress

That is a good base, but it does not validate the public npm package, Docker path, authenticated login flow, or release dist-tags.

### Existing Docker smoke coverage

`scripts/docker-onboard-smoke.sh` already does useful setup work:

- builds `Dockerfile.onboard-smoke`
- runs `paperclipai@${PAPERCLIPAI_VERSION}` inside Docker
- waits for health
- signs up or signs in a smoke admin user
- generates and accepts the bootstrap CEO invite in authenticated mode
- verifies a board session and `/api/companies`

That means the hard bootstrap problem is mostly solved already. The main gap is that the script is human-oriented and never hands control to a browser test.

### Existing CI shape

The repo already has:

- `.github/workflows/e2e.yml` for manual Playwright runs against local source
- `.github/workflows/release.yml` for canary publish on `master` and manual stable promotion

So the right move is to extend the current test/release system, not create a parallel one.
## Product Decision

### 1. The release smoke should stay deterministic and token-free

The first version should not require OpenAI, Anthropic, or external agent credentials.

Use the onboarding flow with a deterministic adapter that can run on a stock GitHub runner and inside the published Docker install. The existing `process` adapter with a trivial command is the right base path for this release gate.

That keeps this test focused on:

- release packaging
- auth/bootstrap
- UI routing
- onboarding contract
- agent creation
- heartbeat invocation plumbing

Later we can add a second credentialed smoke lane for real model-backed agents.

### 2. Smoke credentials become an explicit test contract

The current defaults in `scripts/docker-onboard-smoke.sh` should be treated as stable test fixtures:

- email: `smoke-admin@paperclip.local`
- password: `paperclip-smoke-password`

The browser test should log in with those exact values unless overridden by env vars.

### 3. Published-package smoke and source-tree E2E stay separate

Keep two lanes:

- source-tree E2E for feature development
- published Docker release smoke for release confidence

They overlap on onboarding assertions, but they guard different failure classes.
## Proposed Design

## 1. Add a CI-friendly Docker smoke harness

Refactor `scripts/docker-onboard-smoke.sh` so it can run in two modes:

- interactive mode
  - current behavior
  - streams logs and waits in foreground for manual inspection
- CI mode
  - starts the container
  - waits for health and authenticated bootstrap
  - prints machine-readable metadata
  - exits while leaving the container running for Playwright

Recommended shape:

- keep `scripts/docker-onboard-smoke.sh` as the public entry point
- add a `SMOKE_DETACH=true` or `--detach` mode
- emit a JSON blob or `.env` file containing:
  - `SMOKE_BASE_URL`
  - `SMOKE_ADMIN_EMAIL`
  - `SMOKE_ADMIN_PASSWORD`
  - `SMOKE_CONTAINER_NAME`
  - `SMOKE_DATA_DIR`

The workflow and Playwright tests can then consume the emitted metadata instead of scraping logs.
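The emitted `.env` contract could look like this. A sketch under the proposed `SMOKE_*` names — the credential defaults come from the current script's fixtures, while the container name and data dir defaults are assumptions:

```bash
# Write the connection metadata the detached harness hands to Playwright.
emit_smoke_metadata() {
  local out="$1"
  cat > "$out" <<EOF
SMOKE_BASE_URL=http://localhost:${HOST_PORT:-3232}
SMOKE_ADMIN_EMAIL=${SMOKE_ADMIN_EMAIL:-smoke-admin@paperclip.local}
SMOKE_ADMIN_PASSWORD=${SMOKE_ADMIN_PASSWORD:-paperclip-smoke-password}
SMOKE_CONTAINER_NAME=${SMOKE_CONTAINER_NAME:-paperclip-release-smoke}
SMOKE_DATA_DIR=${DATA_DIR:-./data/release-smoke-canary}
EOF
}

emit_smoke_metadata ./smoke-metadata.env
```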
### Why this matters

The current script always tails logs and then blocks on `wait "$LOG_PID"`. That is convenient for manual smoke testing, but it is the wrong shape for CI orchestration.

## 2. Add a dedicated Playwright release-smoke spec

Create a second Playwright entry point specifically for published Docker installs, for example:

- `tests/release-smoke/playwright.config.ts`
- `tests/release-smoke/docker-auth-onboarding.spec.ts`

This suite should not use Playwright `webServer`, because the app server will already be running inside Docker.
### Browser scenario

The first release-smoke scenario should validate:

1. open `/`
2. unauthenticated user is redirected to `/auth`
3. sign in using the smoke credentials
4. authenticated user lands on onboarding when no companies exist
5. onboarding wizard appears with the expected step labels
6. create a company
7. create the first agent using `process`
8. create the initial issue
9. finish onboarding and open the created issue
10. verify via API:
    - company exists
    - CEO agent exists
    - issue exists and is assigned to the CEO
11. verify the first heartbeat run was triggered:
    - either by checking issue status changed from initial state, or
    - by checking agent/runs API shows a run for the CEO, or
    - both

The test should tolerate the run completing quickly. For this reason, the assertion should accept:

- `queued`
- `running`
- `succeeded`

and similarly for issue progression if the issue status changes before the assertion runs.
### Why a separate spec instead of reusing `tests/e2e/onboarding.spec.ts`

The local-source test and release-smoke test have different assumptions:

- different server lifecycle
- different auth path
- different deployment mode
- published npm package instead of local workspace code

Trying to force both through one spec will make both worse.
## 3. Add a release-smoke workflow in GitHub Actions

Add a workflow dedicated to this surface, ideally reusable:

- `.github/workflows/release-smoke.yml`

Recommended triggers:

- `workflow_dispatch`
- `workflow_call`

Recommended inputs:

- `paperclip_version`
  - `canary` or `latest`
- `host_port`
  - optional, default runner-safe port
- `artifact_name`
  - optional for clearer uploads
### Job outline

1. checkout repo
2. install Node/pnpm
3. install Playwright browser dependencies
4. launch Docker smoke harness in detached mode with the chosen dist-tag
5. run the release-smoke Playwright suite against the returned base URL
6. always collect diagnostics:
   - Playwright report
   - screenshots
   - trace
   - `docker logs`
   - harness metadata file
7. stop and remove the container
### Why a reusable workflow
|
||||
|
||||
This lets us:
|
||||
|
||||
- run the smoke manually on demand
|
||||
- call it from `release.yml`
|
||||
- reuse the same job for both `canary` and `latest`
|
||||
|
||||
## 4. Integrate it into release automation incrementally

### Phase A: Manual workflow only

First ship the workflow as manual-only so the harness and test can be stabilized without blocking releases.

### Phase B: Run automatically after canary publish

After `publish_canary` succeeds in `.github/workflows/release.yml`, call the reusable release-smoke workflow with:

- `paperclip_version=canary`

This proves the just-published public canary really boots and onboards.

### Phase C: Run automatically after stable publish

After `publish_stable` succeeds, call the same workflow with:

- `paperclip_version=latest`

This gives us post-publish confirmation that the stable dist-tag is healthy.

### Important nuance

Testing `latest` from npm cannot happen before the stable publish, because the package under test does not exist under `latest` yet. So the `latest` smoke is a post-publish verification, not a pre-publish gate.

If we later want a true pre-publish stable gate, that should be a separate smoke job built from a source ref or a locally built package.
## 5. Make diagnostics first-class

This workflow is only valuable if failures are fast to debug.

Always capture:

- Playwright HTML report
- Playwright trace on failure
- final screenshot on failure
- full `docker logs` output
- emitted smoke metadata
- optional `curl /api/health` snapshot

Without that, the test will become a flaky black box and people will stop trusting it.
## Implementation Plan

## Phase 1: Harness refactor

Files:

- `scripts/docker-onboard-smoke.sh`
- optionally `scripts/lib/docker-onboard-smoke.sh` or a similar helper
- `doc/DOCKER.md`
- `doc/RELEASING.md`

Tasks:

1. Add a detached/CI mode to the Docker smoke script.
2. Make the script emit machine-readable connection metadata.
3. Keep the current interactive manual mode intact.
4. Add reliable cleanup commands for CI.

Acceptance:

- a script invocation can start the published Docker app, auto-bootstrap it, and return control to the caller with enough metadata for browser automation
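To make "machine-readable connection metadata" concrete, a CI consumer could validate the file the harness emits before handing it to Playwright. The field names below (`baseUrl`, `containerName`, `version`) are illustrative assumptions, not the script's actual contract:

```typescript
// Hypothetical shape of the metadata file a detached smoke harness might emit.
// These field names are assumptions for illustration only.
interface SmokeMetadata {
  baseUrl: string;       // URL the Playwright suite should target
  containerName: string; // used for `docker logs` collection and cleanup
  version: string;       // resolved published package version under test
}

// Parse and validate the metadata JSON, failing loudly on missing fields so
// CI errors point at the harness rather than at a confusing Playwright timeout.
function parseSmokeMetadata(raw: string): SmokeMetadata {
  const data = JSON.parse(raw) as Partial<SmokeMetadata>;
  for (const key of ["baseUrl", "containerName", "version"] as const) {
    if (typeof data[key] !== "string" || data[key]!.length === 0) {
      throw new Error(`smoke metadata missing required field: ${key}`);
    }
  }
  return data as SmokeMetadata;
}
```

Validating up front keeps the acceptance criterion testable: the harness either returns usable metadata or fails with a named missing field.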
## Phase 2: Browser release-smoke suite

Files:

- `tests/release-smoke/playwright.config.ts`
- `tests/release-smoke/docker-auth-onboarding.spec.ts`
- root `package.json`

Tasks:

1. Add a dedicated Playwright config for external-server testing.
2. Implement the login + onboarding + CEO creation flow.
3. Assert a CEO run was created or completed.
4. Add a root script such as:
   - `test:release-smoke`

Acceptance:

- the suite passes locally against both:
  - `PAPERCLIPAI_VERSION=canary`
  - `PAPERCLIPAI_VERSION=latest`
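For task 3, asserting "a CEO run was created or completed" is most robust via polling rather than UI waits (see the risks section below on fast runs). A generic helper the spec could use, sketched here with no assumptions about Paperclip's real endpoints:

```typescript
// Poll `check` until it returns a defined value or the deadline passes.
// The release-smoke spec could wrap a run-status API call in `check`.
async function pollUntil<T>(
  check: () => Promise<T | undefined>,
  timeoutMs = 30_000,
  intervalMs = 500,
): Promise<T> {
  const deadline = Date.now() + timeoutMs;
  for (;;) {
    const result = await check();
    if (result !== undefined) return result;
    if (Date.now() > deadline) throw new Error("pollUntil: timed out");
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
}
```

Usage inside the spec would look like `await pollUntil(() => fetchCeoRunStatus(baseUrl))`, where `fetchCeoRunStatus` is a hypothetical wrapper around whatever run-listing API the app exposes.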
## Phase 3: GitHub Actions workflow

Files:

- `.github/workflows/release-smoke.yml`

Tasks:

1. Add manual and reusable workflow entry points.
2. Install Chromium and runner dependencies.
3. Start the Docker smoke harness in detached mode.
4. Run the release-smoke Playwright suite.
5. Upload diagnostics artifacts.

Acceptance:

- a maintainer can run the workflow manually for either `canary` or `latest`
## Phase 4: Release workflow integration

Files:

- `.github/workflows/release.yml`
- `doc/RELEASING.md`

Tasks:

1. Trigger the release smoke automatically after canary publish.
2. Trigger the release smoke automatically after stable publish.
3. Document expected behavior and failure handling.

Acceptance:

- canary releases automatically produce a published-package browser smoke result
- stable releases automatically produce a `latest` browser smoke result
## Phase 5: Future extension for real model-backed agent validation

Not part of the first implementation, but this should be the next layer once the deterministic lane is stable.

Possible additions:

- a second Playwright project gated on repo secrets
- real `claude_local` or `codex_local` adapter validation in Docker-capable environments
- an assertion that the CEO posts a real task/comment artifact
- a stable-release holdback until the credentialed lane passes

This should stay optional until the token-free lane is trustworthy.
## Acceptance Criteria

The plan is complete when the implemented system can demonstrate all of the following:

1. A published `paperclipai@canary` Docker install can be smoke-tested by Playwright in CI.
2. A published `paperclipai@latest` Docker install can be smoke-tested by Playwright in CI.
3. The test logs into authenticated mode with the smoke credentials.
4. The test sees onboarding for a fresh instance.
5. The test completes onboarding in the browser.
6. The test verifies the initial CEO agent was created.
7. The test verifies at least one CEO heartbeat run was triggered.
8. Failures produce actionable artifacts rather than just a red job.
## Risks And Decisions To Make

### 1. Fast process runs may finish before the UI visibly updates

That is expected. The assertions should prefer API polling for run existence/status rather than only visual indicators.

### 2. `latest` smoke is post-publish, not preventive

This is a real limitation of testing the published dist-tag itself. It is still valuable, but it should not be confused with a pre-publish gate.

### 3. We should not overcouple the test to cosmetic onboarding text

The important contract is flow success, created entities, and run creation. Use visible labels sparingly and prefer stable semantic selectors where possible.

### 4. Keep the smoke adapter path boring

For release safety, the first test should use the most boring runnable adapter possible. This is not the place to validate every adapter.
## Recommended First Slice

If we want the fastest path to value, ship this in order:

1. add detached mode to `scripts/docker-onboard-smoke.sh`
2. add one Playwright spec for authenticated login + onboarding + CEO run verification
3. add a manual `release-smoke.yml`
4. once stable, wire canary into `release.yml`
5. after that, wire the stable `latest` smoke into `release.yml`

That gives release confidence quickly without turning the first version into a large CI redesign.

---

`doc/plans/2026-03-17-memory-service-surface-api.md` (new file, 426 lines)

# Paperclip Memory Service Plan

## Goal

Define a Paperclip memory service and surface API that can sit above multiple memory backends while preserving Paperclip's control-plane requirements:

- company scoping
- auditability
- provenance back to Paperclip work objects
- budget / cost visibility
- plugin-first extensibility

This plan is based on the external landscape summarized in `doc/memory-landscape.md` and on the current Paperclip architecture in:

- `doc/SPEC-implementation.md`
- `doc/plugins/PLUGIN_SPEC.md`
- `doc/plugins/PLUGIN_AUTHORING_GUIDE.md`
- `packages/plugins/sdk/src/types.ts`

## Recommendation In One Sentence

Paperclip should not embed one opinionated memory engine into core. It should add a company-scoped memory control plane with a small normalized adapter contract, then let built-ins and plugins implement the provider-specific behavior.
## Product Decisions

### 1. Memory is company-scoped by default

Every memory binding belongs to exactly one company.

That binding can then be:

- the company default
- an agent override
- a project override later, if we need it

No cross-company memory sharing in the initial design.

### 2. Providers are selected by key

Each configured memory provider gets a stable key inside a company, for example:

- `default`
- `mem0-prod`
- `local-markdown`
- `research-kb`

Agents and services resolve the active provider by key, not by hard-coded vendor logic.
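Resolution order follows from decisions 1 and 2: an agent-level override wins over the company default. A minimal sketch, using field names that echo the `memory_binding_targets` table suggested later in this plan (the names are assumptions, not a committed schema):

```typescript
// One row of the hypothetical binding-target table: which binding key is
// active for a given company or agent.
interface BindingTarget {
  targetType: "company" | "agent";
  targetId: string;
  bindingKey: string;
}

// Resolve the active binding key: agent override first, company default second.
function resolveBindingKey(
  targets: BindingTarget[],
  companyId: string,
  agentId?: string,
): string | undefined {
  const agentHit = agentId
    ? targets.find((t) => t.targetType === "agent" && t.targetId === agentId)
    : undefined;
  if (agentHit) return agentHit.bindingKey;
  const companyHit = targets.find(
    (t) => t.targetType === "company" && t.targetId === companyId,
  );
  return companyHit?.bindingKey;
}
```

Returning `undefined` when no binding exists lets callers decide whether memory is simply disabled for that scope.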
### 3. Plugins are the primary provider path

Built-ins are useful for a zero-config local path, but most providers should arrive through the existing Paperclip plugin runtime.

That keeps the core small and matches the current direction that optional knowledge-like systems live at the edges.

### 4. Paperclip owns routing, provenance, and accounting

Providers should not decide how Paperclip entities map to governance.

Paperclip core should own:

- who is allowed to call a memory operation
- which company / agent / project scope is active
- which issue / run / comment / document the operation belongs to
- how usage gets recorded

### 5. Automatic memory should be narrow at first

Automatic capture is useful, but broad silent capture is dangerous.

Initial automatic hooks should be:

- post-run capture from agent runs
- issue comment / document capture when the binding enables it
- pre-run recall for agent context hydration

Everything else should start explicit.
## Proposed Concepts

### Memory provider

A built-in or plugin-supplied implementation that stores and retrieves memory.

Examples:

- local markdown + vector index
- mem0 adapter
- supermemory adapter
- MemOS adapter

### Memory binding

A company-scoped configuration record that points to a provider and carries provider-specific config.

This is the object selected by key.

### Memory scope

The normalized Paperclip scope passed into a provider request.

At minimum:

- `companyId`
- optional `agentId`
- optional `projectId`
- optional `issueId`
- optional `runId`
- optional `subjectId` for external/user identity

### Memory source reference

The provenance handle that explains where a memory came from.

Supported source kinds should include:

- `issue_comment`
- `issue_document`
- `issue`
- `run`
- `activity`
- `manual_note`
- `external_document`

### Memory operation

A normalized write, query, browse, or delete action performed through Paperclip.

Paperclip should log every operation, whether the provider is local or external.

## Required Adapter Contract

The required core should be small enough to fit `memsearch`, `mem0`, `Memori`, `MemOS`, or `OpenViking`.
```ts
export interface MemoryAdapterCapabilities {
  profile?: boolean;
  browse?: boolean;
  correction?: boolean;
  asyncIngestion?: boolean;
  multimodal?: boolean;
  providerManagedExtraction?: boolean;
}

export interface MemoryScope {
  companyId: string;
  agentId?: string;
  projectId?: string;
  issueId?: string;
  runId?: string;
  subjectId?: string;
}

export interface MemorySourceRef {
  kind:
    | "issue_comment"
    | "issue_document"
    | "issue"
    | "run"
    | "activity"
    | "manual_note"
    | "external_document";
  companyId: string;
  issueId?: string;
  commentId?: string;
  documentKey?: string;
  runId?: string;
  activityId?: string;
  externalRef?: string;
}

export interface MemoryUsage {
  provider: string;
  model?: string;
  inputTokens?: number;
  outputTokens?: number;
  embeddingTokens?: number;
  costCents?: number;
  latencyMs?: number;
  details?: Record<string, unknown>;
}

export interface MemoryWriteRequest {
  bindingKey: string;
  scope: MemoryScope;
  source: MemorySourceRef;
  content: string;
  metadata?: Record<string, unknown>;
  mode?: "append" | "upsert" | "summarize";
}

export interface MemoryRecordHandle {
  providerKey: string;
  providerRecordId: string;
}

export interface MemoryQueryRequest {
  bindingKey: string;
  scope: MemoryScope;
  query: string;
  topK?: number;
  intent?: "agent_preamble" | "answer" | "browse";
  metadataFilter?: Record<string, unknown>;
}

export interface MemorySnippet {
  handle: MemoryRecordHandle;
  text: string;
  score?: number;
  summary?: string;
  source?: MemorySourceRef;
  metadata?: Record<string, unknown>;
}

export interface MemoryContextBundle {
  snippets: MemorySnippet[];
  profileSummary?: string;
  usage?: MemoryUsage[];
}

export interface MemoryAdapter {
  key: string;
  capabilities: MemoryAdapterCapabilities;
  write(req: MemoryWriteRequest): Promise<{
    records?: MemoryRecordHandle[];
    usage?: MemoryUsage[];
  }>;
  query(req: MemoryQueryRequest): Promise<MemoryContextBundle>;
  get(handle: MemoryRecordHandle, scope: MemoryScope): Promise<MemorySnippet | null>;
  forget(handles: MemoryRecordHandle[], scope: MemoryScope): Promise<{ usage?: MemoryUsage[] }>;
}
```

This contract intentionally does not force a provider to expose its internal graph, filesystem, or ontology.
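As a sanity check that the contract is implementable without provider-specific features, here is a deliberately naive in-memory provider. Types are abbreviated from the full contract above, and the substring-match "retrieval" is a toy stand-in for real ranking:

```typescript
// Abbreviated local copies of the contract types, for a self-contained sketch.
interface Scope { companyId: string; agentId?: string; }
interface Handle { providerKey: string; providerRecordId: string; }
interface Snippet { handle: Handle; text: string; score?: number; }

class InMemoryAdapter {
  readonly key = "local-naive";
  private records = new Map<string, { companyId: string; text: string }>();
  private nextId = 0;

  async write(req: { bindingKey: string; scope: Scope; content: string }) {
    const id = String(this.nextId++);
    this.records.set(id, { companyId: req.scope.companyId, text: req.content });
    return { records: [{ providerKey: this.key, providerRecordId: id }] };
  }

  async query(req: { scope: Scope; query: string; topK?: number }) {
    const snippets: Snippet[] = [];
    for (const [id, rec] of this.records) {
      // Company scoping is enforced inside the adapter here; in the real
      // design Paperclip core would also enforce it before the call.
      if (rec.companyId !== req.scope.companyId) continue;
      if (rec.text.toLowerCase().includes(req.query.toLowerCase())) {
        snippets.push({
          handle: { providerKey: this.key, providerRecordId: id },
          text: rec.text,
          score: 1,
        });
      }
    }
    return { snippets: snippets.slice(0, req.topK ?? 10) };
  }

  async forget(handles: Handle[]) {
    for (const h of handles) this.records.delete(h.providerRecordId);
    return {};
  }
}
```

If even this toy adapter cannot satisfy the interface, the required surface is too large; that is the test the "small enough to fit" claim above has to pass.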
## Optional Adapter Surfaces

These should be capability-gated, not required:

- `browse(scope, filters)` for file-system / graph / timeline inspection
- `correct(handle, patch)` for natural-language correction flows
- `profile(scope)` when the provider can synthesize stable preferences or summaries
- `sync(source)` for connectors or background ingestion
- `explain(queryResult)` for providers that can expose retrieval traces

## What Paperclip Should Persist

Paperclip should not mirror the full provider memory corpus into Postgres unless the provider is a Paperclip-managed local provider.

Paperclip core should persist:

- memory bindings and overrides
- provider keys and capability metadata
- normalized memory operation logs
- provider record handles returned by operations, when available
- source references back to issue comments, documents, runs, and activity
- usage and cost data

For external providers, the memory payload itself can remain in the provider.
## Hook Model

### Automatic hooks

These should be low-risk and easy to reason about:

1. `pre-run hydrate`
   Before an agent run starts, Paperclip may call `query(... intent = "agent_preamble")` using the active binding.

2. `post-run capture`
   After a run finishes, Paperclip may write a summary or transcript-derived note tied to the run.

3. `issue comment / document capture`
   When enabled on the binding, Paperclip may capture selected issue comments or issue documents as memory sources.

### Explicit hooks

These should be tool- or UI-driven first:

- `memory.search`
- `memory.note`
- `memory.forget`
- `memory.correct`
- `memory.browse`

### Not automatic in the first version

- broad web crawling
- silent import of arbitrary repo files
- cross-company memory sharing
- automatic destructive deletion
- provider migration between bindings

## Agent UX Rules

Paperclip should give agents both automatic recall and explicit tools, with simple guidance:

- use `memory.search` when the task depends on prior decisions, people, projects, or long-running context that is not in the current issue thread
- use `memory.note` when a durable fact, preference, or decision should survive this run
- use `memory.correct` when the user explicitly says prior context is wrong
- rely on post-run auto-capture for ordinary session residue, so agents do not have to write memory notes for every trivial exchange

This keeps memory available without forcing every agent prompt to become a memory-management protocol.
## Browse And Inspect Surface

Paperclip needs a first-class UI for memory; otherwise providers become black boxes.

The initial browse surface should support:

- active binding by company and agent
- recent memory operations
- recent write sources
- query results with source backlinks
- filters by agent, issue, run, source kind, and date
- provider usage / cost / latency summaries

When a provider supports richer browsing, the plugin can add deeper views through the existing plugin UI surfaces.

## Cost And Evaluation

Every adapter response should be able to return usage records.

Paperclip should roll up:

- memory inference tokens
- embedding tokens
- external provider cost
- latency
- query count
- write count

It should also record evaluation-oriented metrics where possible:

- recall hit rate
- empty query rate
- manual correction count
- per-binding success / failure counts

This matters because a memory system that "works" but silently burns budget is not acceptable in Paperclip.
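The roll-up above is simple arithmetic over per-operation `MemoryUsage` records. A minimal sketch (the aggregate field names are assumptions; `MemoryUsage` is abbreviated from the adapter contract earlier in this plan):

```typescript
// Abbreviated per-operation usage record, mirroring MemoryUsage above.
interface Usage {
  inputTokens?: number;
  outputTokens?: number;
  embeddingTokens?: number;
  costCents?: number;
  latencyMs?: number;
}

// Aggregate a batch of usage records into the totals the plan calls for.
// Missing fields are treated as zero so partial provider reporting still sums.
function rollUpUsage(records: Usage[]) {
  const total = {
    inferenceTokens: 0,
    embeddingTokens: 0,
    costCents: 0,
    operations: records.length,
    maxLatencyMs: 0,
  };
  for (const u of records) {
    total.inferenceTokens += (u.inputTokens ?? 0) + (u.outputTokens ?? 0);
    total.embeddingTokens += u.embeddingTokens ?? 0;
    total.costCents += u.costCents ?? 0;
    total.maxLatencyMs = Math.max(total.maxLatencyMs, u.latencyMs ?? 0);
  }
  return total;
}
```

Keeping this as a pure function over logged operations means the budget view can be recomputed from the operation log at any time.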
## Suggested Data Model Additions

At the control-plane level, the likely new core tables are:

- `memory_bindings`
  - company-scoped key
  - provider id / plugin id
  - config blob
  - enabled status
- `memory_binding_targets`
  - target type (`company`, `agent`, later `project`)
  - target id
  - binding id
- `memory_operations`
  - company id
  - binding id
  - operation type (`write`, `query`, `forget`, `browse`, `correct`)
  - scope fields
  - source refs
  - usage / latency / cost
  - success / error

Provider-specific long-form state should stay in plugin state or in the provider itself unless a built-in local provider needs its own schema.
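To make the `memory_operations` shape concrete, here is one way a row could be assembled at the end of an operation. Column names are assumptions for illustration, not a committed schema:

```typescript
// Illustrative row shape for the memory_operations table sketched above.
interface MemoryOperationRow {
  companyId: string;
  bindingId: string;
  operationType: "write" | "query" | "forget" | "browse" | "correct";
  scope: Record<string, string | undefined>;
  sourceRefs: unknown[];
  costCents?: number;
  latencyMs?: number;
  success: boolean;
  error?: string;
}

// Derive the success flag from the outcome so the log cannot record a row
// that claims success while also carrying an error message.
function buildOperationRow(
  base: Omit<MemoryOperationRow, "success" | "error">,
  outcome: { error?: string },
): MemoryOperationRow {
  return { ...base, success: outcome.error === undefined, error: outcome.error };
}
```

Logging through a single constructor like this keeps the "log every operation" rule enforceable in one place.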
## Recommended First Built-In

The best zero-config built-in is a local markdown-first provider with optional semantic indexing.

Why:

- it matches Paperclip's local-first posture
- it is inspectable
- it is easy to back up and debug
- it gives the system a baseline even without external API keys

The design should still treat that built-in as just another provider behind the same control-plane contract.
## Rollout Phases

### Phase 1: Control-plane contract

- add memory binding models and API types
- add a plugin capability / registration surface for memory providers
- add operation logging and usage reporting

### Phase 2: One built-in + one plugin example

- ship a local markdown-first provider
- ship one hosted adapter example to validate the external-provider path

### Phase 3: UI inspection

- add company / agent memory settings
- add a memory operation explorer
- add source backlinks to issues and runs

### Phase 4: Automatic hooks

- pre-run hydrate
- post-run capture
- selected issue comment / document capture

### Phase 5: Rich capabilities

- correction flows
- provider-native browse / graph views
- project-level overrides if needed
- evaluation dashboards

## Open Questions

- Should project overrides exist in V1 of the memory service, or should we force company default + agent override first?
- Do we want Paperclip-managed extraction pipelines at all, or should built-ins be the only place where Paperclip owns extraction?
- Should memory usage extend the current `cost_events` model directly, or should memory operations keep a parallel usage log and roll up into `cost_events` secondarily?
- Do we want provider install / binding changes to require approvals for some companies?

## Bottom Line

The right abstraction is:

- Paperclip owns memory bindings, scopes, provenance, governance, and usage reporting.
- Providers own extraction, ranking, storage, and provider-native memory semantics.

That gives Paperclip a stable "memory service" without locking the product to one memory philosophy or one vendor.

---

`doc/plans/2026-03-17-release-automation-and-versioning.md` (new file, 488 lines)

# Release Automation and Versioning Simplification Plan

## Context

Paperclip's current release flow is documented in `doc/RELEASING.md` and implemented through:

- `.github/workflows/release.yml`
- `scripts/release-lib.sh`
- `scripts/release-start.sh`
- `scripts/release-preflight.sh`
- `scripts/release.sh`
- `scripts/create-github-release.sh`

Today the model is:

1. pick `patch`, `minor`, or `major`
2. create `release/X.Y.Z`
3. draft `releases/vX.Y.Z.md`
4. publish one or more canaries from that release branch
5. publish stable from that same branch
6. push the tag and create a GitHub Release
7. merge the release branch back to `master`

That is workable, but it creates friction in exactly the places that should be cheap:

- deciding `patch` vs `minor` vs `major`
- cutting and carrying release branches
- manually publishing canaries
- thinking about changelog generation for canaries
- handling npm credentials safely in a public repo

The target state from this discussion is simpler:

- every push to `master` publishes a canary automatically
- stable releases are promoted deliberately from a vetted commit
- versioning is date-driven instead of semantics-driven
- stable publishing is secure even in a public open-source repository
- changelog generation happens only for real stable releases
## Recommendation In One Sentence

Move Paperclip to semver-compatible calendar versioning, auto-publish canaries from `master`, promote stable from a chosen tested commit, and use npm trusted publishing plus GitHub environments so that no long-lived npm or LLM token needs to live in Actions.

## Core Decisions

### 1. Use calendar versions, but keep semver syntax

The repo and npm tooling still assume semver-shaped version strings in many places. That does not mean Paperclip must keep semver as a product policy. It does mean the version format should remain semver-valid.

Recommended format:

- stable: `YYYY.MDD.P`
- canary: `YYYY.MDD.P-canary.N`

Examples:

- first stable on March 17, 2026: `2026.317.0`
- third canary on the `2026.317.0` line: `2026.317.0-canary.2`

Why this shape:

- it removes `patch/minor/major` decisions
- it is valid semver syntax
- it stays compatible with npm, dist-tags, and existing semver validators
- it is close to the format you actually want

Important constraints:

- the middle numeric slot should be `MDD`, where `M` is the month and `DD` is the zero-padded day
- `2026.03.17` is not the format to use
  - numeric semver identifiers do not allow leading zeroes
- `2026.3.17.1` is not the format to use
  - semver has three numeric components, not four
  - the practical semver-safe equivalent is `2026.317.0-canary.8`

This is effectively CalVer on semver rails.
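The version math above is mechanical once the constraints are fixed: month without padding, day zero-padded, patch slot counting same-day stable releases. A sketch of the computation (function names are illustrative, not the release scripts' actual API):

```typescript
// Build the YYYY.MDD base for a UTC date: month 1-12 unpadded, day 01-31
// zero-padded, so March 17 -> "317" and November 5 -> "1105".
function calverBase(utc: Date): string {
  const month = utc.getUTCMonth() + 1;
  const day = String(utc.getUTCDate()).padStart(2, "0");
  return `${utc.getUTCFullYear()}.${month}${day}`;
}

// Stable version: base plus the same-day stable patch slot P.
function stableVersion(utc: Date, patchSlot: number): string {
  return `${calverBase(utc)}.${patchSlot}`;
}

// Canary version: a prerelease suffix on the stable line it targets.
function canaryVersion(utc: Date, patchSlot: number, n: number): string {
  return `${stableVersion(utc, patchSlot)}-canary.${n}`;
}
```

Note how the `MDD` rule keeps every component free of leading zeroes, which is what keeps these strings valid semver.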
### 2. Accept that CalVer changes the compatibility contract

This is not semver in spirit anymore. It is semver in syntax only.

That tradeoff is probably acceptable for Paperclip, but it should be explicit:

- consumers no longer infer compatibility from `major/minor/patch`
- release notes become the compatibility signal
- downstream users should prefer exact pins or deliberate upgrades

This is especially relevant for public library packages like `@paperclipai/shared`, `@paperclipai/db`, and the adapter packages.

### 3. Drop release branches for normal publishing

If every merge to `master` publishes a canary, the current `release/X.Y.Z` train model becomes more ceremony than value.

Recommended replacement:

- `master` is the only canary train
- every push to `master` can publish a canary
- stable is published from a chosen commit or canary tag on `master`

This matches the workflow you actually want:

- merge continuously
- let npm always have a fresh canary
- choose a known-good canary later and promote that commit to stable
### 4. Promote by source ref, not by "renaming" a canary

This is the most important mechanical constraint.

npm can move dist-tags, but it does not let you rename an already-published version. That means:

- you can move `latest` to point at `paperclipai@2026.317.0`
- you cannot turn `paperclipai@2026.317.0-canary.8` into `paperclipai@2026.317.0`

So "promote canary to stable" really means:

1. choose the commit or canary tag you trust
2. rebuild from that exact commit
3. publish it again with the stable version string

Because of that, the stable workflow should take a source ref, not just a bump type.

Recommended stable input:

- `source_ref`
  - a commit SHA, or
  - a canary git tag such as `canary/v2026.317.1-canary.8`
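When `source_ref` is a canary tag, promotion tooling can extract the canary's base version and cross-check it against the stable version about to be published. A sketch against the tag format defined in this plan:

```typescript
// Parse a canary git tag like "canary/v2026.317.1-canary.8" into its base
// stable version and canary number; return null for anything else.
function parseCanaryTag(tag: string): { base: string; n: number } | null {
  const m = /^canary\/v(\d{4}\.\d{3,4}\.\d+)-canary\.(\d+)$/.exec(tag);
  if (!m) return null;
  return { base: m[1], n: Number(m[2]) };
}
```

The `\d{3,4}` in the middle slot accepts both single-digit months (`317`) and double-digit months (`1105`), matching the `MDD` rule above.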
### 5. Only stable releases get release notes, tags, and GitHub Releases

Canaries should stay lightweight:

- publish to npm under `canary`
- optionally create a lightweight or annotated git tag
- do not create GitHub Releases
- do not require `releases/v*.md`
- do not spend LLM tokens

Stable releases should remain the public narrative surface:

- git tag `v2026.317.0`
- GitHub Release `v2026.317.0`
- stable changelog file `releases/v2026.317.0.md`
## Security Model

### Recommendation

Use npm trusted publishing with GitHub Actions OIDC, then disable token-based publishing access for the packages.

Why:

- no long-lived `NPM_TOKEN` in repo or org secrets
- no personal npm token in Actions
- short-lived credentials minted only for the authorized workflow
- automatic npm provenance for public packages in public repos

This is the cleanest answer to the open-repo security concern.
### Concrete controls

#### 1. Use one release workflow file

Use one workflow filename for both canary and stable publishing:

- `.github/workflows/release.yml`

Why:

- npm trusted publishing is configured per workflow filename
- npm currently allows one trusted-publisher configuration per package
- GitHub environments can still provide separate canary/stable approval rules inside the same workflow

#### 2. Use separate GitHub environments

Recommended environments:

- `npm-canary`
- `npm-stable`

Recommended policy:

- `npm-canary`
  - allowed branch: `master`
  - no human reviewer required
- `npm-stable`
  - allowed branch: `master`
  - required reviewer enabled
  - prevent self-review enabled
  - admin bypass disabled

Stable should require an explicit second human gate even if the workflow is manually dispatched.
#### 3. Lock down workflow edits

Add or tighten `CODEOWNERS` coverage for:

- `.github/workflows/*`
- `scripts/release*`
- `doc/RELEASING.md`

This matters because trusted publishing authorizes a workflow file. The biggest remaining risk is not secret exfiltration from forks; it is a maintainer-approved change to the release workflow itself.

#### 4. Remove traditional npm token access after OIDC works

After trusted publishing is verified:

- set package publishing access to require 2FA and disallow tokens
- revoke any legacy automation tokens

That eliminates the "someone stole the npm token" class of failure.

### What not to do

- do not put your personal Claude or npm token in GitHub Actions
- do not run release logic from `pull_request_target`
- do not make stable publishing depend on a repo secret if OIDC can handle it
- do not create canary GitHub Releases
## Changelog Strategy

### Recommendation

Generate stable changelogs only, and keep LLM-assisted changelog generation out of CI for now.

Reasoning:

- canaries happen too often
- canaries do not need polished public notes
- putting a personal Claude token into Actions is not worth the risk
- stable release cadence is low enough that a human-in-the-loop step is acceptable

Recommended stable path:

1. pick a canary commit or tag
2. run changelog generation locally from a trusted machine
3. commit `releases/vYYYY.MDD.P.md`
4. run the stable promotion

If the notes are not ready yet, a fallback is acceptable:

- publish stable
- create a minimal GitHub Release
- update `releases/vYYYY.MDD.P.md` immediately afterward

But the better steady state is to have the stable notes committed before the stable publish.

### Future option

If you later want CI-assisted changelog drafting, do it with:

- a dedicated service account
- a token scoped only for changelog generation
- a manual workflow
- a dedicated environment with required reviewers

That is phase-two hardening work, not a phase-one requirement.
## Proposed Future Workflow
|
||||
|
||||
### Canary workflow
|
||||
|
||||
Trigger:
|
||||
|
||||
- `push` on `master`
|
||||
|
||||
Steps:
|
||||
|
||||
1. checkout the merged `master` commit
|
||||
2. run verification on that exact commit
|
||||
3. compute canary version for current UTC date
|
||||
4. version public packages to `YYYY.MDD.P-canary.N`
|
||||
5. publish to npm with dist-tag `canary`
|
||||
6. create a canary git tag for traceability
|
||||
|
||||
Recommended canary tag format:
|
||||
|
||||
- `canary/v2026.317.1-canary.4`
|
||||
|
||||
Outputs:
|
||||
|
||||
- npm canary published
|
||||
- git tag created
|
||||
- no GitHub Release
|
||||
- no changelog file required
|
||||
|
||||
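The canary numbering in steps 3–4 can be sketched as a small pure function over the already-published version list. This mirrors the scheme described above; the function name and inputs are illustrative, not the repo's actual script.

```javascript
// Illustrative canary numbering: append -canary.N to the day's stable slot,
// where N is one past the highest already-published canary for that slot.
function nextCanaryVersion(stableVersion, publishedVersions) {
  // Escape the dots in the stable version so they match literally.
  const pattern = new RegExp(`^${stableVersion.replace(/\./g, "\\.")}-canary\\.(\\d+)$`);
  let max = -1;
  for (const v of publishedVersions) {
    const m = v.match(pattern);
    if (m) max = Math.max(max, Number(m[1]));
  }
  return `${stableVersion}-canary.${max + 1}`;
}

console.log(nextCanaryVersion("2026.317.1", ["2026.317.1-canary.0", "2026.317.1-canary.4"]));
// -> 2026.317.1-canary.5
```

Counting from the published list (rather than a local counter) keeps the workflow idempotent across re-runs.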
### Stable workflow

Trigger:

- `workflow_dispatch`

Inputs:

- `source_ref`
- optional `stable_date`
- `dry_run`

Steps:

1. checkout `source_ref`
2. run verification on that exact commit
3. compute the next stable patch slot for the UTC date or provided override
4. fail if `vYYYY.MDD.P` already exists
5. require `releases/vYYYY.MDD.P.md`
6. version public packages to `YYYY.MDD.P`
7. publish to npm under `latest`
8. create git tag `vYYYY.MDD.P`
9. push the tag
10. create a GitHub Release from `releases/vYYYY.MDD.P.md`

Outputs:

- stable npm release
- stable git tag
- GitHub Release
- clean public changelog surface
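Steps 3–5 above amount to a small precondition guard. A minimal sketch, assuming the version format, tag list, and notes path described in this document (the function and its inputs are hypothetical):

```javascript
// Hypothetical guard for stable promotion: refuse to publish if the computed
// version is malformed, already tagged, or missing its committed notes file.
function checkStablePreconditions({ version, existingTags, notesFiles }) {
  // YYYY.MDD.P: four-digit year, 3-4 digit month+day, numeric patch slot.
  if (!/^\d{4}\.\d{3,4}\.\d+$/.test(version)) {
    return { ok: false, reason: `not a stable calver version: ${version}` };
  }
  if (existingTags.includes(`v${version}`)) {
    return { ok: false, reason: `tag v${version} already exists` };
  }
  if (!notesFiles.includes(`releases/v${version}.md`)) {
    return { ok: false, reason: `missing releases/v${version}.md` };
  }
  return { ok: true };
}
```

Making the notes file a hard precondition is what keeps the "publish first, write notes later" fallback an exception rather than the default.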
## Implementation Guidance

### 1. Replace bump-type version math with explicit version computation

The current release scripts depend on:

- `patch`
- `minor`
- `major`

That logic should be replaced with:

- `compute_canary_version_for_date`
- `compute_stable_version_for_date`

For example:

- `next_stable_version(2026-03-17) -> 2026.317.0`
- `next_canary_for_utc_date(2026-03-17) -> 2026.317.0-canary.0`
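The date-to-slot mapping can be sketched as a pair of pure functions (names here are illustrative, not the repo's actual script), following the `YYYY.MDD.P` rules stated earlier: month not zero-padded, day always zero-padded, `P` the same-day patch slot.

```javascript
// Map a UTC date to the calver slot, e.g. 2026-03-17 -> "2026.317".
function stableSlotForDate(isoDate) {
  const date = new Date(`${isoDate}T00:00:00Z`);
  if (Number.isNaN(date.getTime())) throw new Error(`invalid date: ${isoDate}`);
  const month = date.getUTCMonth() + 1; // 1-12, no padding
  const day = String(date.getUTCDate()).padStart(2, "0"); // always two digits
  return `${date.getUTCFullYear()}.${month}${day}`;
}

// The next stable version is the date slot plus the first unused patch slot.
function nextStableVersion(isoDate, publishedPatchSlots = []) {
  const slot = stableSlotForDate(isoDate);
  const next = publishedPatchSlots.length === 0 ? 0 : Math.max(...publishedPatchSlots) + 1;
  return `${slot}.${next}`;
}

console.log(nextStableVersion("2026-03-17"));      // -> 2026.317.0
console.log(nextStableVersion("2026-03-17", [0])); // -> 2026.317.1
```

Because the computation is a pure function of the date and the published versions, both canary and stable workflows can share it without any bump-type input.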
### 2. Stop requiring `release/X.Y.Z`

These current invariants should be removed from the happy path:

- "must run from branch `release/X.Y.Z`"
- "stable and canary for `X.Y.Z` come from the same release branch"
- `release-start.sh`

Replace them with:

- canary must run from `master`
- stable may run from a pinned `source_ref`
### 3. Keep Changesets only if it stays helpful

The current system uses Changesets to:

- rewrite package versions
- maintain package-level `CHANGELOG.md` files
- publish packages

With CalVer, Changesets may still be useful for publish orchestration, but it should no longer own version selection.

Recommended implementation order:

1. keep `changeset publish` if it works with explicitly set versions
2. replace version computation with a small explicit versioning script
3. if Changesets keeps fighting the model, remove it from release publishing entirely

Paperclip's release problem is now "publish the whole fixed package set at one explicit version", not "derive the next semantic bump from human intent".
### 4. Add a dedicated versioning script

Recommended new script:

- `scripts/set-release-version.mjs`

Responsibilities:

- set the version in all public publishable packages
- update any internal exact-version references needed for publishing
- update CLI version strings
- avoid broad string replacement across unrelated files

This is safer than keeping a bump-oriented changeset flow and then forcing it into a date-based scheme.
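The core of such a script can be sketched as a pure transform over a parsed `package.json` (the file name above is the recommendation's; this function and its inputs are hypothetical):

```javascript
// Rewrite the version field and pin internal workspace dependencies to the
// exact release version, leaving every other field untouched.
function setReleaseVersion(pkg, version, internalPackages = new Set()) {
  const next = { ...pkg, version };
  for (const field of ["dependencies", "devDependencies"]) {
    if (!pkg[field]) continue;
    next[field] = { ...pkg[field] };
    for (const name of Object.keys(next[field])) {
      if (internalPackages.has(name)) next[field][name] = version; // exact pin
    }
  }
  return next;
}

const updated = setReleaseVersion(
  {
    name: "paperclipai",
    version: "0.0.0",
    dependencies: { "@paperclipai/server": "0.0.0", react: "^19.0.0" },
  },
  "2026.318.0",
  new Set(["@paperclipai/server"]),
);
```

Operating on the parsed object (rather than string replacement) is what keeps unrelated fields and lookalike strings safe.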
### 5. Keep rollback based on dist-tags

`rollback-latest.sh` should stay, but it should stop assuming any semver meaning beyond syntax.

It should continue to:

- repoint `latest` to a prior stable version
- never unpublish
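The rollback target selection reduces to "highest published stable strictly below the current version", treating versions as syntax only. A sketch under that assumption (not the actual script):

```javascript
// Pick the stable version to repoint `latest` at: the highest published
// stable (no -canary suffix) strictly below the version being rolled back.
function rollbackTarget(published, current) {
  const stable = published.filter((v) => /^\d+\.\d+\.\d+$/.test(v)); // drop prereleases
  const toKey = (v) => v.split(".").map(Number);
  const lt = (a, b) => {
    const [x, y] = [toKey(a), toKey(b)];
    for (let i = 0; i < 3; i++) if (x[i] !== y[i]) return x[i] < y[i];
    return false;
  };
  const candidates = stable.filter((v) => lt(v, current)).sort((a, b) => (lt(a, b) ? -1 : 1));
  return candidates.at(-1); // undefined if there is nothing to roll back to
}
```

The actual repoint would then be a single `npm dist-tag add <pkg>@<target> latest`; nothing is ever unpublished.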
## Tradeoffs and Risks

### 1. The stable patch slot is now part of the version contract

With `YYYY.MDD.P`, same-day hotfixes are supported, but the stable patch slot is now part of the visible version format.

That is the right tradeoff because:

1. npm still gets semver-valid versions
2. same-day hotfixes stay possible
3. chronological ordering still works as long as the day is zero-padded inside `MDD`
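Why the padded day matters can be checked directly with field-by-field numeric comparison, which is how npm orders these versions (a sketch using plain numbers rather than the `semver` package; stable versions only, no prerelease handling):

```javascript
// Compare two stable calver versions the way semver orders them:
// numerically, field by field.
function calverCompare(a, b) {
  const pa = a.split(".").map(Number);
  const pb = b.split(".").map(Number);
  for (let i = 0; i < 3; i++) {
    if (pa[i] !== pb[i]) return pa[i] < pb[i] ? -1 : 1;
  }
  return 0;
}

// With the padded day, March 17 (317) sorts before April 9 (409) and
// October 9 (1009), so version order matches chronological order.
// Without padding, April 9 would render as 49 and sort before March 17.
```

This is the property that lets `git tag --sort=-version:refname` and npm's own ordering keep working unchanged.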
### 2. Public package consumers lose semver intent signaling

This is the main downside of CalVer.

If that becomes a problem, one alternative is:

- use CalVer for the CLI package only
- keep semver for library packages

That is more complex operationally, so I would not start there unless package consumers actually need it.
### 3. Auto-canary means more publish traffic

Publishing on every `master` merge means:

- more npm versions
- more git tags
- more registry noise

That is acceptable if canaries stay clearly separate:

- npm dist-tag `canary`
- no GitHub Release
- no external announcement
## Rollout Plan

### Phase 1: Security foundation

1. Create `release.yml`
2. Configure npm trusted publishers for all public packages
3. Create `npm-canary` and `npm-stable` environments
4. Add `CODEOWNERS` protection for release files
5. Verify OIDC publishing works
6. Disable token-based publishing access and revoke old tokens

### Phase 2: Canary automation

1. Add canary workflow on `push` to `master`
2. Add explicit calendar-version computation
3. Add canary git tagging
4. Remove changelog requirement from canaries
5. Update `doc/RELEASING.md`

### Phase 3: Stable promotion

1. Add manual stable workflow with `source_ref`
2. Require stable notes file
3. Publish stable + tag + GitHub Release
4. Update rollback docs and scripts
5. Retire release-branch assumptions

### Phase 4: Cleanup

1. Remove `release-start.sh` from the primary path
2. Remove `patch/minor/major` from maintainer docs
3. Decide whether to keep or remove Changesets from publishing
4. Document the CalVer compatibility contract publicly
## Concrete Recommendation

Paperclip should adopt this model:

- stable versions: `YYYY.MDD.P`
- canary versions: `YYYY.MDD.P-canary.N`
- canaries auto-published on every push to `master`
- stables manually promoted from a chosen tested commit or canary tag
- no release branches in the default path
- no canary changelog files
- no canary GitHub Releases
- no Claude token in GitHub Actions
- no npm automation token in GitHub Actions
- npm trusted publishing plus GitHub environments for release security

That gets rid of the annoying part of semver without fighting npm, makes canaries cheap, keeps stables deliberate, and materially improves the security posture of the public repository.
## External References

- npm trusted publishing: https://docs.npmjs.com/trusted-publishers/
- npm dist-tags: https://docs.npmjs.com/adding-dist-tags-to-packages/
- npm semantic versioning guidance: https://docs.npmjs.com/about-semantic-versioning/
- GitHub environments and deployment protection rules: https://docs.github.com/en/actions/how-tos/deploy/configure-and-manage-deployments/manage-environments
- GitHub secrets behavior for forks: https://docs.github.com/en/actions/how-tos/write-workflows/choose-what-workflows-do/use-secrets
package.json (11 changes)

@@ -18,23 +18,22 @@
"db:backup": "./scripts/backup-db.sh",
"paperclipai": "node cli/node_modules/tsx/dist/cli.mjs cli/src/index.ts",
"build:npm": "./scripts/build-npm.sh",
"release:start": "./scripts/release-start.sh",
"release": "./scripts/release.sh",
"release:preflight": "./scripts/release-preflight.sh",
"release:canary": "./scripts/release.sh canary",
"release:stable": "./scripts/release.sh stable",
"release:github": "./scripts/create-github-release.sh",
"release:rollback": "./scripts/rollback-latest.sh",
"changeset": "changeset",
"version-packages": "changeset version",
"check:tokens": "node scripts/check-forbidden-tokens.mjs",
"docs:dev": "cd docs && npx mintlify dev",
"smoke:openclaw-join": "./scripts/smoke/openclaw-join.sh",
"smoke:openclaw-docker-ui": "./scripts/smoke/openclaw-docker-ui.sh",
"smoke:openclaw-sse-standalone": "./scripts/smoke/openclaw-sse-standalone.sh",
"test:e2e": "npx playwright test --config tests/e2e/playwright.config.ts",
"test:e2e:headed": "npx playwright test --config tests/e2e/playwright.config.ts --headed"
"test:e2e:headed": "npx playwright test --config tests/e2e/playwright.config.ts --headed",
"test:release-smoke": "npx playwright test --config tests/release-smoke/playwright.config.ts",
"test:release-smoke:headed": "npx playwright test --config tests/release-smoke/playwright.config.ts --headed"
},
"devDependencies": {
"@changesets/cli": "^2.30.0",
"cross-env": "^10.1.0",
"@playwright/test": "^1.58.2",
"esbuild": "^0.27.3",
@@ -1,4 +1,5 @@
import { createHash } from "node:crypto";
import os from "node:os";
import type { AdapterModel } from "@paperclipai/adapter-utils";
import {
asString,
@@ -20,7 +21,7 @@ function resolveOpenCodeCommand(input: unknown): string {

const discoveryCache = new Map<string, { expiresAt: number; models: AdapterModel[] }>();
const VOLATILE_ENV_KEY_PREFIXES = ["PAPERCLIP_", "npm_", "NPM_"] as const;
const VOLATILE_ENV_KEY_EXACT = new Set(["PWD", "OLDPWD", "SHLVL", "_", "TERM_SESSION_ID"]);
const VOLATILE_ENV_KEY_EXACT = new Set(["PWD", "OLDPWD", "SHLVL", "_", "TERM_SESSION_ID", "HOME"]);

function dedupeModels(models: AdapterModel[]): AdapterModel[] {
const seen = new Set<string>();
@@ -107,7 +108,19 @@ export async function discoverOpenCodeModels(input: {
const command = resolveOpenCodeCommand(input.command);
const cwd = asString(input.cwd, process.cwd());
const env = normalizeEnv(input.env);
const runtimeEnv = normalizeEnv(ensurePathInEnv({ ...process.env, ...env }));
// Ensure HOME points to the actual running user's home directory.
// When the server is started via `runuser -u <user>`, HOME may still
// reflect the parent process (e.g. /root), causing OpenCode to miss
// provider auth credentials stored under the target user's home.
let resolvedHome: string | undefined;
try {
resolvedHome = os.userInfo().homedir || undefined;
} catch {
// os.userInfo() throws a SystemError when the current UID has no
// /etc/passwd entry (e.g. `docker run --user 1234` with a minimal
// image). Fall back to process.env.HOME.
}
const runtimeEnv = normalizeEnv(ensurePathInEnv({ ...process.env, ...env, ...(resolvedHome ? { HOME: resolvedHome } : {}) }));

const result = await runChildProcess(
`opencode-models-${Date.now()}-${Math.random().toString(16).slice(2)}`,
pnpm-lock.yaml (generated, 757 changes)

File diff suppressed because it is too large.
releases/v2026.318.0.md (new file, 65 lines)

@@ -0,0 +1,65 @@
# v2026.318.0

> Released: 2026-03-18

## Highlights

- **Plugin framework and SDK** — Full plugin system with runtime lifecycle management, CLI tooling, settings UI, breadcrumb and slot extensibility, domain event bridge, and a kitchen-sink example. The Plugin SDK now includes document CRUD methods and a testing harness. ([#904](https://github.com/paperclipai/paperclip/pull/904), [#910](https://github.com/paperclipai/paperclip/pull/910), [#912](https://github.com/paperclipai/paperclip/pull/912), [#909](https://github.com/paperclipai/paperclip/pull/909), [#1074](https://github.com/paperclipai/paperclip/pull/1074), @gsxdsm, @mvanhorn, @residentagent)
- **Upgraded costs and budgeting** — Improved cost tracking and budget management surfaces. ([#949](https://github.com/paperclipai/paperclip/pull/949))
- **Issue documents and attachments** — Issues now support inline document editing, file staging before creation, deep-linked documents, copy and download actions, and live-event refresh. ([#899](https://github.com/paperclipai/paperclip/pull/899))
- **Hermes agent adapter** — New `hermes_local` adapter brings support for the Hermes CLI as an agent backend. ([#587](https://github.com/paperclipai/paperclip/pull/587), @teknium1)
- **Execution workspaces (EXPERIMENTAL)** — Isolated execution workspaces for agent runs, including workspace operation tracking, reusable workspace deduplication, and work product management. Project-level workspace policies are configurable. ([#1038](https://github.com/paperclipai/paperclip/pull/1038))
- **Heartbeat token optimization** — Heartbeat cycles now skip redundant token usage.

## Improvements

- **Session compaction is adapter-aware** — Compaction logic now respects per-adapter context limits.
- **Company logos** — Upload and display company logos with SVG sanitization and enhanced security headers for asset responses. ([#162](https://github.com/paperclipai/paperclip/pull/162), @JonCSykes)
- **App version label** — The sidebar now displays the running Paperclip version. ([#1096](https://github.com/paperclipai/paperclip/pull/1096), @saishankar404)
- **Project tab caching** — Active project tab is remembered per-project; tabs have been renamed and reordered. ([#990](https://github.com/paperclipai/paperclip/pull/990))
- **Copy-to-clipboard on issues** — Issue detail headers now include a copy button; HTML entities no longer leak into copied text. ([#990](https://github.com/paperclipai/paperclip/pull/990))
- **Me and Unassigned assignee options** — Quick-filter assignee options for the current user and unassigned issues. ([#990](https://github.com/paperclipai/paperclip/pull/990))
- **Skip pre-filled fields in new issue dialog** — Tab order now skips assignee and project fields when they are already populated. ([#990](https://github.com/paperclipai/paperclip/pull/990))
- **Worktree cleanup command** — New `worktree:cleanup` command, env-var defaults, and auto-prefix for worktree branches. ([#1038](https://github.com/paperclipai/paperclip/pull/1038))
- **Release automation** — Automated canary and stable release workflows with npm trusted publishing and provenance metadata. ([#1151](https://github.com/paperclipai/paperclip/pull/1151), [#1162](https://github.com/paperclipai/paperclip/pull/1162))
- **Documentation link** — Sidebar documentation link now points to external docs.paperclip.ing.
- **Onboarding starter task delay** — Starter tasks are no longer created until the user launches.

## Fixes

- **Embedded PostgreSQL hardening** — Startup adoption, data-dir verification, and UTF-8 encoding are now handled reliably. (@vkartaviy)
- **`os.userInfo()` guard** — Containers with UID-only users no longer crash; HOME is excluded from the cache key. ([#1145](https://github.com/paperclipai/paperclip/pull/1145), @wesseljt)
- **opencode-local HOME resolution** — `os.userInfo()` is used for model discovery instead of relying on the HOME env var. ([#1145](https://github.com/paperclipai/paperclip/pull/1145), @wesseljt)
- **dotenv cwd fallback** — The server now loads `.env` from `cwd` when `.paperclip/.env` is missing. ([#834](https://github.com/paperclipai/paperclip/pull/834), @mvanhorn)
- **Plugin event subscription wiring** — Fixed subscription cleanup, filter nullability, and stale diagram. ([#988](https://github.com/paperclipai/paperclip/pull/988), @leeknowsai)
- **Plugin slot rendering** — Corrected slot registration and rendering for plugin UI extensions. ([#916](https://github.com/paperclipai/paperclip/pull/916), [#918](https://github.com/paperclipai/paperclip/pull/918), @gsxdsm)
- **Archive project UX** — Archive now navigates to the dashboard and shows a toast; replaced `window.confirm` with inline confirmation.
- **Markdown editor spacing** — Image drop/paste adds proper newlines; header top margins increased.
- **Workspace form refresh** — Forms now refresh when projects are accessed via URL key and allow empty saves.
- **Legacy migration reconciliation** — Fixed migration reconciliation for existing installations.
- **`archivedAt` type coercion** — String-to-Date conversion before Drizzle update prevents type errors.
- **Agent HOME env var** — `AGENT_HOME` is now set correctly for child agent processes. ([#864](https://github.com/paperclipai/paperclip/pull/864))
- **Sidebar scrollbar hover track** — Fixed scrollbar track visibility on hover. ([#919](https://github.com/paperclipai/paperclip/pull/919))
- **Sticky save bar on non-config tabs** — Hidden to prevent layout push.
- **Empty goals display** — Removed "None" text from empty goals.
- **Runs page padding** — Removed unnecessary right padding.
- **Codex bootstrap logs** — Treated as stdout instead of stderr.
- **Dev runner syntax** — Fixed syntax issue in plugin dev runner. ([#914](https://github.com/paperclipai/paperclip/pull/914), @gsxdsm)
- **Process list** — Fixed process list rendering. ([#903](https://github.com/paperclipai/paperclip/pull/903), @gsxdsm)

## Upgrade Guide

Ten new database migrations (`0028`–`0037`) will run automatically on startup:

- **Migrations 0028–0029** add plugin framework tables.
- **Migrations 0030–0037** extend the schema for issue documents, execution workspaces, company logos, cost tracking, and plugin enhancements.

All migrations are additive (new tables and columns) — no existing data is modified. Standard `paperclipai` startup will apply them automatically.

If you use the `.env` file, note that the server now falls back to loading `.env` from the current working directory when `.paperclip/.env` is not found.

## Contributors

Thank you to everyone who contributed to this release!

@gsxdsm, @JonCSykes, @leeknowsai, @mvanhorn, @residentagent, @saishankar404, @teknium1, @vkartaviy, @wesseljt
@@ -15,9 +15,11 @@ CLI_DIR="$REPO_ROOT/cli"
DIST_DIR="$CLI_DIR/dist"

skip_checks=false
skip_typecheck=false
for arg in "$@"; do
case "$arg" in
--skip-checks) skip_checks=true ;;
--skip-typecheck) skip_typecheck=true ;;
esac
done

@@ -32,12 +34,16 @@ else
fi

# ── Step 2: TypeScript type-check ──────────────────────────────────────────────
echo " [2/5] Type-checking..."
cd "$REPO_ROOT"
pnpm -r typecheck
if [ "$skip_typecheck" = false ]; then
echo " [2/6] Type-checking..."
cd "$REPO_ROOT"
pnpm -r typecheck
else
echo " [2/6] Skipping type-check (--skip-typecheck)"
fi

# ── Step 3: Bundle CLI with esbuild ────────────────────────────────────────────
echo " [3/5] Bundling CLI with esbuild..."
echo " [3/6] Bundling CLI with esbuild..."
cd "$CLI_DIR"
rm -rf dist
@@ -14,12 +14,13 @@ Usage:
./scripts/create-github-release.sh <version> [--dry-run]

Examples:
./scripts/create-github-release.sh 1.2.3
./scripts/create-github-release.sh 1.2.3 --dry-run
./scripts/create-github-release.sh 2026.318.0
./scripts/create-github-release.sh 2026.318.0 --dry-run

Notes:
- Run this after pushing the stable release branch and tag.
- Defaults to git remote public-gh.
- Run this after pushing the stable tag.
- Resolves the git remote automatically.
- In GitHub Actions, origin is used explicitly.
- If the release already exists, this script updates its title and notes.
EOF
}
@@ -48,13 +49,15 @@ if [ -z "$version" ]; then
fi

if [[ ! "$version" =~ ^[0-9]+\.[0-9]+\.[0-9]+$ ]]; then
echo "Error: version must be a stable semver like 1.2.3." >&2
echo "Error: version must be a stable calendar version like 2026.318.0." >&2
exit 1
fi

tag="v$version"
notes_file="$REPO_ROOT/releases/${tag}.md"
PUBLISH_REMOTE="${PUBLISH_REMOTE:-public-gh}"
if [ "${GITHUB_ACTIONS:-}" = "true" ] && [ -z "${PUBLISH_REMOTE:-}" ] && git_remote_exists origin; then
PUBLISH_REMOTE=origin
fi
PUBLISH_REMOTE="$(resolve_release_remote)"
if ! command -v gh >/dev/null 2>&1; then
echo "Error: gh CLI is required to create GitHub releases." >&2
@@ -7,6 +7,8 @@ HOST_PORT="${HOST_PORT:-3131}"
PAPERCLIPAI_VERSION="${PAPERCLIPAI_VERSION:-latest}"
DATA_DIR="${DATA_DIR:-$REPO_ROOT/data/docker-onboard-smoke}"
HOST_UID="${HOST_UID:-$(id -u)}"
SMOKE_DETACH="${SMOKE_DETACH:-false}"
SMOKE_METADATA_FILE="${SMOKE_METADATA_FILE:-}"
PAPERCLIP_DEPLOYMENT_MODE="${PAPERCLIP_DEPLOYMENT_MODE:-authenticated}"
PAPERCLIP_DEPLOYMENT_EXPOSURE="${PAPERCLIP_DEPLOYMENT_EXPOSURE:-private}"
PAPERCLIP_PUBLIC_URL="${PAPERCLIP_PUBLIC_URL:-http://localhost:${HOST_PORT}}"
@@ -18,6 +20,7 @@ CONTAINER_NAME="${IMAGE_NAME//[^a-zA-Z0-9_.-]/-}"
LOG_PID=""
COOKIE_JAR=""
TMP_DIR=""
PRESERVE_CONTAINER_ON_EXIT="false"

mkdir -p "$DATA_DIR"

@@ -25,7 +28,9 @@ cleanup() {
if [[ -n "$LOG_PID" ]]; then
kill "$LOG_PID" >/dev/null 2>&1 || true
fi
docker stop "$CONTAINER_NAME" >/dev/null 2>&1 || true
if [[ "$PRESERVE_CONTAINER_ON_EXIT" != "true" ]]; then
docker stop "$CONTAINER_NAME" >/dev/null 2>&1 || true
fi
if [[ -n "$TMP_DIR" && -d "$TMP_DIR" ]]; then
rm -rf "$TMP_DIR"
fi
@@ -33,6 +38,12 @@ cleanup() {

trap cleanup EXIT INT TERM

container_is_running() {
local running
running="$(docker inspect -f '{{.State.Running}}' "$CONTAINER_NAME" 2>/dev/null || true)"
[[ "$running" == "true" ]]
}

wait_for_http() {
local url="$1"
local attempts="${2:-60}"
@@ -42,11 +53,36 @@ wait_for_http() {
if curl -fsS "$url" >/dev/null 2>&1; then
return 0
fi
if ! container_is_running; then
echo "Smoke bootstrap failed: container $CONTAINER_NAME exited before $url became ready" >&2
docker logs "$CONTAINER_NAME" >&2 || true
return 1
fi
sleep "$sleep_seconds"
done
if ! container_is_running; then
echo "Smoke bootstrap failed: container $CONTAINER_NAME exited before readiness check completed" >&2
docker logs "$CONTAINER_NAME" >&2 || true
fi
return 1
}

write_metadata_file() {
if [[ -z "$SMOKE_METADATA_FILE" ]]; then
return 0
fi
mkdir -p "$(dirname "$SMOKE_METADATA_FILE")"
{
printf 'SMOKE_BASE_URL=%q\n' "$PAPERCLIP_PUBLIC_URL"
printf 'SMOKE_ADMIN_EMAIL=%q\n' "$SMOKE_ADMIN_EMAIL"
printf 'SMOKE_ADMIN_PASSWORD=%q\n' "$SMOKE_ADMIN_PASSWORD"
printf 'SMOKE_CONTAINER_NAME=%q\n' "$CONTAINER_NAME"
printf 'SMOKE_DATA_DIR=%q\n' "$DATA_DIR"
printf 'SMOKE_IMAGE_NAME=%q\n' "$IMAGE_NAME"
printf 'SMOKE_PAPERCLIPAI_VERSION=%q\n' "$PAPERCLIPAI_VERSION"
} >"$SMOKE_METADATA_FILE"
}

generate_bootstrap_invite_url() {
local bootstrap_output
local bootstrap_status
@@ -214,9 +250,12 @@ echo "==> Running onboard smoke container"
echo " UI should be reachable at: http://localhost:$HOST_PORT"
echo " Public URL: $PAPERCLIP_PUBLIC_URL"
echo " Smoke auto-bootstrap: $SMOKE_AUTO_BOOTSTRAP"
echo " Detached mode: $SMOKE_DETACH"
echo " Data dir: $DATA_DIR"
echo " Deployment: $PAPERCLIP_DEPLOYMENT_MODE/$PAPERCLIP_DEPLOYMENT_EXPOSURE"
echo " Live output: onboard banner and server logs stream in this terminal (Ctrl+C to stop)"
if [[ "$SMOKE_DETACH" != "true" ]]; then
echo " Live output: onboard banner and server logs stream in this terminal (Ctrl+C to stop)"
fi

docker rm -f "$CONTAINER_NAME" >/dev/null 2>&1 || true

@@ -231,8 +270,10 @@ docker run -d --rm \
-v "$DATA_DIR:/paperclip" \
"$IMAGE_NAME" >/dev/null

docker logs -f "$CONTAINER_NAME" &
LOG_PID=$!
if [[ "$SMOKE_DETACH" != "true" ]]; then
docker logs -f "$CONTAINER_NAME" &
LOG_PID=$!
fi

TMP_DIR="$(mktemp -d "${TMPDIR:-/tmp}/paperclip-onboard-smoke.XXXXXX")"
COOKIE_JAR="$TMP_DIR/cookies.txt"
@@ -246,4 +287,17 @@ if [[ "$SMOKE_AUTO_BOOTSTRAP" == "true" && "$PAPERCLIP_DEPLOYMENT_MODE" == "auth
auto_bootstrap_authenticated_smoke
fi

write_metadata_file

if [[ "$SMOKE_DETACH" == "true" ]]; then
PRESERVE_CONTAINER_ON_EXIT="true"
echo "==> Smoke container ready for automation"
echo " Smoke base URL: $PAPERCLIP_PUBLIC_URL"
echo " Smoke admin credentials: $SMOKE_ADMIN_EMAIL / $SMOKE_ADMIN_PASSWORD"
if [[ -n "$SMOKE_METADATA_FILE" ]]; then
echo " Smoke metadata file: $SMOKE_METADATA_FILE"
fi
exit 0
fi

wait "$LOG_PID"
@@ -37,7 +37,7 @@ const workspacePaths = [
];

// Workspace packages that are NOT bundled and must stay as npm dependencies.
// These get published separately via Changesets and resolved at runtime.
// These get published separately and resolved at runtime.
const externalWorkspacePackages = new Set([
"@paperclipai/server",
]);
@@ -57,7 +57,7 @@ for (const pkgPath of workspacePaths) {
if (externalWorkspacePackages.has(name)) {
const pkgDirMap = { "@paperclipai/server": "server" };
const wsPkg = readPkg(pkgDirMap[name]);
allDeps[name] = `^${wsPkg.version}`;
allDeps[name] = wsPkg.version;
continue;
}
// Keep the more specific (pinned) version if conflict
@@ -94,6 +94,7 @@ const publishPkg = {
license: cliPkg.license,
repository: cliPkg.repository,
homepage: cliPkg.homepage,
bugs: cliPkg.bugs,
files: cliPkg.files,
engines: { node: ">=20" },
dependencies: sortedDeps,
@@ -64,6 +64,11 @@ resolve_release_remote() {
|
||||
return
|
||||
fi
|
||||
|
||||
if git_remote_exists public; then
|
||||
printf 'public\n'
|
||||
return
|
||||
fi
|
||||
|
||||
if git_remote_exists origin; then
|
||||
printf 'origin\n'
|
||||
return
|
||||
@@ -76,6 +81,18 @@ fetch_release_remote() {
|
||||
git -C "$REPO_ROOT" fetch "$1" --prune --tags
|
||||
}
|
||||
|
||||
git_current_branch() {
|
||||
git -C "$REPO_ROOT" symbolic-ref --quiet --short HEAD 2>/dev/null || true
|
||||
}
|
||||
|
||||
git_local_tag_exists() {
|
||||
git -C "$REPO_ROOT" show-ref --verify --quiet "refs/tags/$1"
|
||||
}
|
||||
|
||||
git_remote_tag_exists() {
|
||||
git -C "$REPO_ROOT" ls-remote --exit-code --tags "$2" "refs/tags/$1" >/dev/null 2>&1
|
||||
}
|
||||
|
||||
get_last_stable_tag() {
|
||||
git -C "$REPO_ROOT" tag --list 'v*' --sort=-version:refname | head -1
|
||||
}
|
||||
@@ -90,110 +107,130 @@ get_current_stable_version() {
|
||||
fi
|
||||
}
|
||||
|
||||
compute_bumped_version() {
|
||||
node - "$1" "$2" <<'NODE'
|
||||
const current = process.argv[2];
|
||||
const bump = process.argv[3];
|
||||
const match = current.match(/^(\d+)\.(\d+)\.(\d+)$/);
|
||||
stable_version_slot_for_date() {
|
||||
node - "${1:-}" <<'NODE'
|
||||
const input = process.argv[2];
|
||||
|
||||
if (!match) {
|
||||
throw new Error(`invalid semver version: ${current}`);
|
||||
const date = input ? new Date(`${input}T00:00:00Z`) : new Date();
|
||||
if (Number.isNaN(date.getTime())) {
|
||||
console.error(`invalid date: ${input}`);
|
||||
process.exit(1);
|
||||
}
|
||||
|
||||
let [major, minor, patch] = match.slice(1).map(Number);
|
||||
const month = String(date.getUTCMonth() + 1);
|
||||
const day = String(date.getUTCDate()).padStart(2, '0');
|
||||
|
||||
if (bump === 'patch') {
|
||||
patch += 1;
|
||||
} else if (bump === 'minor') {
|
||||
minor += 1;
|
||||
patch = 0;
|
||||
} else if (bump === 'major') {
|
||||
major += 1;
|
||||
minor = 0;
|
||||
patch = 0;
|
||||
} else {
|
||||
throw new Error(`unsupported bump type: ${bump}`);
|
||||
process.stdout.write(`${date.getUTCFullYear()}.${month}${day}`);
|
||||
NODE
|
||||
}
|
||||
|
||||
process.stdout.write(`${major}.${minor}.${patch}`);
|
||||
utc_date_iso() {
|
||||
node <<'NODE'
|
||||
const date = new Date();
|
||||
const y = date.getUTCFullYear();
|
||||
const m = String(date.getUTCMonth() + 1).padStart(2, '0');
|
||||
const d = String(date.getUTCDate()).padStart(2, '0');
|
||||
process.stdout.write(`${y}-${m}-${d}`);
|
||||
NODE
|
||||
}
|
||||
|
||||
next_stable_version() {
|
||||
local release_date="$1"
|
||||
shift
|
||||
|
||||
node - "$release_date" "$@" <<'NODE'
|
||||
const input = process.argv[2];
|
||||
const packageNames = process.argv.slice(3);
|
||||
const { execSync } = require("node:child_process");
|
||||
|
||||
const date = input ? new Date(`${input}T00:00:00Z`) : new Date();
|
||||
if (Number.isNaN(date.getTime())) {
|
||||
console.error(`invalid date: ${input}`);
|
||||
process.exit(1);
|
||||
}
|
||||
|
||||
const stableSlot = `${date.getUTCFullYear()}.${date.getUTCMonth() + 1}${String(date.getUTCDate()).padStart(2, "0")}`;
|
||||
const pattern = new RegExp(`^${stableSlot.replace(/\./g, '\\.')}\.(\\d+)$`);
|
||||
let max = -1;
|
||||
|
||||
for (const packageName of packageNames) {
|
||||
let versions = [];
|
||||
|
||||
try {
|
||||
const raw = execSync(`npm view ${JSON.stringify(packageName)} versions --json`, {
|
||||
encoding: "utf8",
|
||||
stdio: ["ignore", "pipe", "ignore"],
|
||||
}).trim();
|
||||
|
||||
if (raw) {
|
||||
const parsed = JSON.parse(raw);
|
||||
versions = Array.isArray(parsed) ? parsed : [parsed];
|
||||
}
|
||||
} catch {
|
||||
versions = [];
|
||||
}
|
||||
|
||||
for (const version of versions) {
|
||||
const match = version.match(pattern);
|
||||
if (!match) continue;
|
||||
max = Math.max(max, Number(match[1]));
|
||||
}
|
||||
}
|
||||
|
||||
process.stdout.write(`${stableSlot}.${max + 1}`);
|
||||
NODE
|
||||
}

next_canary_version() {
  local stable_version="$1"
  local versions_json
  shift

  versions_json="$(npm view paperclipai versions --json 2>/dev/null || echo '[]')"

  node - "$stable_version" "$versions_json" <<'NODE'
  node - "$stable_version" "$@" <<'NODE'
const stable = process.argv[2];
const versionsArg = process.argv[3];

let versions = [];
try {
  const parsed = JSON.parse(versionsArg);
  versions = Array.isArray(parsed) ? parsed : [parsed];
} catch {
  versions = [];
}
const packageNames = process.argv.slice(3);
const { execSync } = require("node:child_process");

const pattern = new RegExp(`^${stable.replace(/\./g, '\\.')}-canary\\.(\\d+)$`);
let max = -1;

for (const version of versions) {
  const match = version.match(pattern);
  if (!match) continue;
  max = Math.max(max, Number(match[1]));
for (const packageName of packageNames) {
  let versions = [];

  try {
    const raw = execSync(`npm view ${JSON.stringify(packageName)} versions --json`, {
      encoding: "utf8",
      stdio: ["ignore", "pipe", "ignore"],
    }).trim();

    if (raw) {
      const parsed = JSON.parse(raw);
      versions = Array.isArray(parsed) ? parsed : [parsed];
    }
  } catch {
    versions = [];
  }

  for (const version of versions) {
    const match = version.match(pattern);
    if (!match) continue;
    max = Math.max(max, Number(match[1]));
  }
}

process.stdout.write(`${stable}-canary.${max + 1}`);
NODE
}
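The canary-slot scan above can be sketched on its own (a minimal sketch; `nextCanary` is a hypothetical name, using the same escape-then-match pattern as the heredoc):

```javascript
// Hypothetical standalone sketch of the canary-slot scan: escape the dots in
// the stable version, match "-canary.N" suffixes, and take max(N) + 1.
function nextCanary(stable, versions) {
  const pattern = new RegExp(`^${stable.replace(/\./g, "\\.")}-canary\\.(\\d+)$`);
  let max = -1;
  for (const version of versions) {
    const match = version.match(pattern);
    if (match) max = Math.max(max, Number(match[1]));
  }
  return `${stable}-canary.${max + 1}`;
}

console.log(nextCanary("2026.318.1", ["2026.318.1-canary.0", "2026.317.0"])); // 2026.318.1-canary.1
```

Starting `max` at `-1` means an empty (or non-matching) version list yields `-canary.0` for the first canary of a train.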

release_branch_name() {
  printf 'release/%s\n' "$1"
}

release_notes_file() {
  printf '%s/releases/v%s.md\n' "$REPO_ROOT" "$1"
}

default_release_worktree_path() {
  local version="$1"
  local parent_dir
  local repo_name

  parent_dir="$(cd "$REPO_ROOT/.." && pwd)"
  repo_name="$(basename "$REPO_ROOT")"
  printf '%s/%s-release-%s\n' "$parent_dir" "$repo_name" "$version"
stable_tag_name() {
  printf 'v%s\n' "$1"
}

git_current_branch() {
  git -C "$REPO_ROOT" symbolic-ref --quiet --short HEAD 2>/dev/null || true
}

git_local_branch_exists() {
  git -C "$REPO_ROOT" show-ref --verify --quiet "refs/heads/$1"
}

git_remote_branch_exists() {
  git -C "$REPO_ROOT" ls-remote --exit-code --heads "$2" "refs/heads/$1" >/dev/null 2>&1
}

git_local_tag_exists() {
  git -C "$REPO_ROOT" show-ref --verify --quiet "refs/tags/$1"
}

git_remote_tag_exists() {
  git -C "$REPO_ROOT" ls-remote --exit-code --tags "$2" "refs/tags/$1" >/dev/null 2>&1
}

npm_version_exists() {
  local version="$1"
  local resolved

  resolved="$(npm view "paperclipai@${version}" version 2>/dev/null || true)"
  [ "$resolved" = "$version" ]
canary_tag_name() {
  printf 'canary/v%s\n' "$1"
}

npm_package_version_exists() {
@@ -232,50 +269,38 @@ require_clean_worktree() {
  fi
}

git_worktree_path_for_branch() {
  local branch_ref="refs/heads/$1"

  git -C "$REPO_ROOT" worktree list --porcelain | awk -v branch_ref="$branch_ref" '
    $1 == "worktree" { path = substr($0, 10) }
    $1 == "branch" && $2 == branch_ref { print path; exit }
  '
}

path_is_worktree_for_branch() {
  local path="$1"
  local branch="$2"
require_on_master_branch() {
  local current_branch

  [ -d "$path" ] || return 1
  current_branch="$(git -C "$path" symbolic-ref --quiet --short HEAD 2>/dev/null || true)"
  [ "$current_branch" = "$branch" ]
}

ensure_release_branch_for_version() {
  local stable_version="$1"
  local current_branch
  local expected_branch

  current_branch="$(git_current_branch)"
  expected_branch="$(release_branch_name "$stable_version")"

  if [ -z "$current_branch" ]; then
    release_fail "release work must run from branch $expected_branch, but HEAD is detached."
  fi

  if [ "$current_branch" != "$expected_branch" ]; then
    release_fail "release work must run from branch $expected_branch, but current branch is $current_branch."
  if [ "$current_branch" != "master" ]; then
    release_fail "this release step must run from branch master, but current branch is ${current_branch:-<detached>}."
  fi
}

stable_release_exists_anywhere() {
  local stable_version="$1"
  local remote="$2"
  local tag="v$stable_version"
require_npm_publish_auth() {
  local dry_run="$1"

  git_local_tag_exists "$tag" || git_remote_tag_exists "$tag" "$remote" || npm_version_exists "$stable_version"
  if [ "$dry_run" = true ]; then
    return
  fi

  if npm whoami >/dev/null 2>&1; then
    release_info " ✓ Logged in to npm as $(npm whoami)"
    return
  fi

  if [ "${GITHUB_ACTIONS:-}" = "true" ]; then
    release_info " ✓ npm publish auth will be provided by GitHub Actions trusted publishing"
    return
  fi

  release_fail "npm publish auth is not available. Use 'npm login' locally or run from GitHub Actions with trusted publishing."
}

release_train_is_frozen() {
  stable_release_exists_anywhere "$1" "$2"
list_public_package_info() {
  node "$REPO_ROOT/scripts/release-package-map.mjs" list
}

set_public_package_version() {
  node "$REPO_ROOT/scripts/release-package-map.mjs" set-version "$1"
}

168
scripts/release-package-map.mjs
Normal file
@@ -0,0 +1,168 @@
#!/usr/bin/env node

import { readdirSync, readFileSync, writeFileSync, existsSync } from "node:fs";
import { fileURLToPath } from "node:url";
import { dirname, join, resolve } from "node:path";

const __dirname = dirname(fileURLToPath(import.meta.url));
const repoRoot = resolve(__dirname, "..");
const roots = ["packages", "server", "ui", "cli"];

function readJson(filePath) {
  return JSON.parse(readFileSync(filePath, "utf8"));
}

function discoverPublicPackages() {
  const packages = [];

  function walk(relDir) {
    const absDir = join(repoRoot, relDir);
    if (!existsSync(absDir)) return;

    const pkgPath = join(absDir, "package.json");
    if (existsSync(pkgPath)) {
      const pkg = readJson(pkgPath);
      if (!pkg.private) {
        packages.push({
          dir: relDir,
          pkgPath,
          name: pkg.name,
          version: pkg.version,
          pkg,
        });
      }
      return;
    }

    for (const entry of readdirSync(absDir, { withFileTypes: true })) {
      if (!entry.isDirectory()) continue;
      if (entry.name === "node_modules" || entry.name === "dist" || entry.name === ".git") continue;
      walk(join(relDir, entry.name));
    }
  }

  for (const rel of roots) {
    walk(rel);
  }

  return packages;
}

function sortTopologically(packages) {
  const byName = new Map(packages.map((pkg) => [pkg.name, pkg]));
  const visited = new Set();
  const visiting = new Set();
  const ordered = [];

  function visit(pkg) {
    if (visited.has(pkg.name)) return;
    if (visiting.has(pkg.name)) {
      throw new Error(`cycle detected in public package graph at ${pkg.name}`);
    }

    visiting.add(pkg.name);

    const dependencySections = [
      pkg.pkg.dependencies ?? {},
      pkg.pkg.optionalDependencies ?? {},
      pkg.pkg.peerDependencies ?? {},
    ];

    for (const deps of dependencySections) {
      for (const depName of Object.keys(deps)) {
        const dep = byName.get(depName);
        if (dep) visit(dep);
      }
    }

    visiting.delete(pkg.name);
    visited.add(pkg.name);
    ordered.push(pkg);
  }

  for (const pkg of [...packages].sort((a, b) => a.dir.localeCompare(b.dir))) {
    visit(pkg);
  }

  return ordered;
}

function replaceWorkspaceDeps(deps, version) {
  if (!deps) return deps;
  const next = { ...deps };

  for (const [name, value] of Object.entries(next)) {
    if (!name.startsWith("@paperclipai/")) continue;
    if (typeof value !== "string" || !value.startsWith("workspace:")) continue;
    next[name] = version;
  }

  return next;
}
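As a usage sketch of the function above (re-stated here so the snippet is self-contained; only internal `@paperclipai/*` dependencies declared with the `workspace:` protocol are pinned, everything else passes through untouched):

```javascript
// Same logic as replaceWorkspaceDeps in release-package-map.mjs.
function replaceWorkspaceDeps(deps, version) {
  if (!deps) return deps;
  const next = { ...deps };
  for (const [name, value] of Object.entries(next)) {
    if (!name.startsWith("@paperclipai/")) continue;
    if (typeof value !== "string" || !value.startsWith("workspace:")) continue;
    next[name] = version;
  }
  return next;
}

const pinned = replaceWorkspaceDeps(
  { "@paperclipai/core": "workspace:*", react: "^18.0.0" },
  "2026.318.0",
);
console.log(pinned); // { '@paperclipai/core': '2026.318.0', react: '^18.0.0' }
```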

function setVersion(version) {
  const packages = sortTopologically(discoverPublicPackages());

  for (const pkg of packages) {
    const nextPkg = {
      ...pkg.pkg,
      version,
      dependencies: replaceWorkspaceDeps(pkg.pkg.dependencies, version),
      optionalDependencies: replaceWorkspaceDeps(pkg.pkg.optionalDependencies, version),
      peerDependencies: replaceWorkspaceDeps(pkg.pkg.peerDependencies, version),
      devDependencies: replaceWorkspaceDeps(pkg.pkg.devDependencies, version),
    };

    writeFileSync(pkg.pkgPath, `${JSON.stringify(nextPkg, null, 2)}\n`);
  }

  const cliEntryPath = join(repoRoot, "cli/src/index.ts");
  const cliEntry = readFileSync(cliEntryPath, "utf8");
  const nextCliEntry = cliEntry.replace(
    /\.version\("([^"]+)"\)/,
    `.version("${version}")`,
  );

  if (cliEntry === nextCliEntry) {
    throw new Error("failed to rewrite CLI version string in cli/src/index.ts");
  }

  writeFileSync(cliEntryPath, nextCliEntry);
}

function listPackages() {
  const packages = sortTopologically(discoverPublicPackages());
  for (const pkg of packages) {
    process.stdout.write(`${pkg.dir}\t${pkg.name}\t${pkg.version}\n`);
  }
}

function usage() {
  process.stderr.write(
    [
      "Usage:",
      "  node scripts/release-package-map.mjs list",
      "  node scripts/release-package-map.mjs set-version <version>",
      "",
    ].join("\n"),
  );
}

const [command, arg] = process.argv.slice(2);

if (command === "list") {
  listPackages();
  process.exit(0);
}

if (command === "set-version") {
  if (!arg) {
    usage();
    process.exit(1);
  }
  setVersion(arg);
  process.exit(0);
}

usage();
process.exit(1);
@@ -1,201 +0,0 @@
#!/usr/bin/env bash
set -euo pipefail

REPO_ROOT="$(cd "$(dirname "$0")/.." && pwd)"
# shellcheck source=./release-lib.sh
. "$REPO_ROOT/scripts/release-lib.sh"
export GIT_PAGER=cat

channel=""
bump_type=""

usage() {
  cat <<'EOF'
Usage:
  ./scripts/release-preflight.sh <canary|stable> <patch|minor|major>

Examples:
  ./scripts/release-preflight.sh canary patch
  ./scripts/release-preflight.sh stable minor

What it does:
  - verifies the git worktree is clean, including untracked files
  - verifies you are on the matching release/X.Y.Z branch
  - shows the last stable tag and the target version(s)
  - shows the git/npm/GitHub release-train state
  - shows commits since the last stable tag
  - highlights migration/schema/breaking-change signals
  - runs the verification gate:
      pnpm -r typecheck
      pnpm test:run
      pnpm build
EOF
}

while [ $# -gt 0 ]; do
  case "$1" in
    -h|--help)
      usage
      exit 0
      ;;
    *)
      if [ -z "$channel" ]; then
        channel="$1"
      elif [ -z "$bump_type" ]; then
        bump_type="$1"
      else
        echo "Error: unexpected argument: $1" >&2
        exit 1
      fi
      ;;
  esac
  shift
done

if [ -z "$channel" ] || [ -z "$bump_type" ]; then
  usage
  exit 1
fi

if [[ ! "$channel" =~ ^(canary|stable)$ ]]; then
  usage
  exit 1
fi

if [[ ! "$bump_type" =~ ^(patch|minor|major)$ ]]; then
  usage
  exit 1
fi

RELEASE_REMOTE="$(resolve_release_remote)"
fetch_release_remote "$RELEASE_REMOTE"

LAST_STABLE_TAG="$(get_last_stable_tag)"
CURRENT_STABLE_VERSION="$(get_current_stable_version)"
TARGET_STABLE_VERSION="$(compute_bumped_version "$CURRENT_STABLE_VERSION" "$bump_type")"
TARGET_CANARY_VERSION="$(next_canary_version "$TARGET_STABLE_VERSION")"
EXPECTED_RELEASE_BRANCH="$(release_branch_name "$TARGET_STABLE_VERSION")"
CURRENT_BRANCH="$(git_current_branch)"
RELEASE_TAG="v$TARGET_STABLE_VERSION"
NOTES_FILE="$(release_notes_file "$TARGET_STABLE_VERSION")"

require_clean_worktree

if [ "$TARGET_STABLE_VERSION" = "$CURRENT_STABLE_VERSION" ]; then
  echo "Error: next stable version matches the current stable version." >&2
  exit 1
fi

if [[ "$TARGET_CANARY_VERSION" == "${CURRENT_STABLE_VERSION}-canary."* ]]; then
  echo "Error: canary target was derived from the current stable version, which is not allowed." >&2
  exit 1
fi

ensure_release_branch_for_version "$TARGET_STABLE_VERSION"

REMOTE_BRANCH_EXISTS="no"
REMOTE_TAG_EXISTS="no"
LOCAL_TAG_EXISTS="no"
NPM_STABLE_EXISTS="no"

if git_remote_branch_exists "$EXPECTED_RELEASE_BRANCH" "$RELEASE_REMOTE"; then
  REMOTE_BRANCH_EXISTS="yes"
fi

if git_local_tag_exists "$RELEASE_TAG"; then
  LOCAL_TAG_EXISTS="yes"
fi

if git_remote_tag_exists "$RELEASE_TAG" "$RELEASE_REMOTE"; then
  REMOTE_TAG_EXISTS="yes"
fi

if npm_version_exists "$TARGET_STABLE_VERSION"; then
  NPM_STABLE_EXISTS="yes"
fi

if [ "$LOCAL_TAG_EXISTS" = "yes" ] || [ "$REMOTE_TAG_EXISTS" = "yes" ] || [ "$NPM_STABLE_EXISTS" = "yes" ]; then
  echo "Error: release train $EXPECTED_RELEASE_BRANCH is frozen because $RELEASE_TAG already exists locally, remotely, or version $TARGET_STABLE_VERSION is already on npm." >&2
  exit 1
fi

echo ""
echo "==> Release preflight"
echo " Remote: $RELEASE_REMOTE"
echo " Channel: $channel"
echo " Bump: $bump_type"
echo " Current branch: ${CURRENT_BRANCH:-<detached>}"
echo " Expected branch: $EXPECTED_RELEASE_BRANCH"
echo " Last stable tag: ${LAST_STABLE_TAG:-<none>}"
echo " Current stable version: $CURRENT_STABLE_VERSION"
echo " Next stable version: $TARGET_STABLE_VERSION"
if [ "$channel" = "canary" ]; then
  echo " Next canary version: $TARGET_CANARY_VERSION"
  echo " Guard: canaries are always derived from the next stable version, never ${CURRENT_STABLE_VERSION}-canary.N"
fi

echo ""
echo "==> Working tree"
echo " ✓ Clean"
echo " ✓ Branch matches release train"

echo ""
echo "==> Release train state"
echo " Remote branch exists: $REMOTE_BRANCH_EXISTS"
echo " Local stable tag exists: $LOCAL_TAG_EXISTS"
echo " Remote stable tag exists: $REMOTE_TAG_EXISTS"
echo " Stable version on npm: $NPM_STABLE_EXISTS"
if [ -f "$NOTES_FILE" ]; then
  echo " Release notes: present at $NOTES_FILE"
else
  echo " Release notes: missing at $NOTES_FILE"
fi

if [ "$REMOTE_BRANCH_EXISTS" = "no" ]; then
  echo " Warning: remote branch $EXPECTED_RELEASE_BRANCH does not exist on $RELEASE_REMOTE yet."
fi

echo ""
echo "==> Commits since last stable tag"
if [ -n "$LAST_STABLE_TAG" ]; then
  git -C "$REPO_ROOT" --no-pager log "${LAST_STABLE_TAG}..HEAD" --oneline --no-merges || true
else
  git -C "$REPO_ROOT" --no-pager log --oneline --no-merges || true
fi

echo ""
echo "==> Migration / breaking change signals"
if [ -n "$LAST_STABLE_TAG" ]; then
  echo "-- migrations --"
  git -C "$REPO_ROOT" --no-pager diff --name-only "${LAST_STABLE_TAG}..HEAD" -- packages/db/src/migrations/ || true
  echo "-- schema --"
  git -C "$REPO_ROOT" --no-pager diff "${LAST_STABLE_TAG}..HEAD" -- packages/db/src/schema/ || true
  echo "-- breaking commit messages --"
  git -C "$REPO_ROOT" --no-pager log "${LAST_STABLE_TAG}..HEAD" --format="%s" | grep -E 'BREAKING CHANGE|BREAKING:|^[a-z]+!:' || true
else
  echo "No stable tag exists yet. Review the full current tree manually."
fi

echo ""
echo "==> Verification gate"
cd "$REPO_ROOT"
pnpm -r typecheck
pnpm test:run
pnpm build

echo ""
echo "==> Release preflight summary"
echo " Remote: $RELEASE_REMOTE"
echo " Channel: $channel"
echo " Bump: $bump_type"
echo " Release branch: $EXPECTED_RELEASE_BRANCH"
echo " Last stable tag: ${LAST_STABLE_TAG:-<none>}"
echo " Current stable version: $CURRENT_STABLE_VERSION"
echo " Next stable version: $TARGET_STABLE_VERSION"
if [ "$channel" = "canary" ]; then
  echo " Next canary version: $TARGET_CANARY_VERSION"
  echo " Guard: canaries are always derived from the next stable version, never ${CURRENT_STABLE_VERSION}-canary.N"
fi

echo ""
echo "Preflight passed for $channel release."
@@ -1,182 +0,0 @@
#!/usr/bin/env bash
set -euo pipefail

REPO_ROOT="$(cd "$(dirname "$0")/.." && pwd)"
# shellcheck source=./release-lib.sh
. "$REPO_ROOT/scripts/release-lib.sh"

dry_run=false
push_branch=true
bump_type=""
worktree_path=""

usage() {
  cat <<'EOF'
Usage:
  ./scripts/release-start.sh <patch|minor|major> [--dry-run] [--no-push] [--worktree-dir PATH]

Examples:
  ./scripts/release-start.sh patch
  ./scripts/release-start.sh minor --dry-run
  ./scripts/release-start.sh major --worktree-dir ../paperclip-release-1.0.0

What it does:
  - fetches the release remote and tags
  - computes the next stable version from the latest stable tag
  - creates or resumes branch release/X.Y.Z
  - creates or resumes a dedicated worktree for that branch
  - pushes the release branch to the remote by default

Notes:
  - Stable publishes freeze a release train. If vX.Y.Z already exists locally,
    remotely, or on npm, this script refuses to reuse release/X.Y.Z.
  - Use --no-push only if you intentionally do not want the release branch on
    GitHub yet.
EOF
}

while [ $# -gt 0 ]; do
  case "$1" in
    --dry-run) dry_run=true ;;
    --no-push) push_branch=false ;;
    --worktree-dir)
      shift
      [ $# -gt 0 ] || release_fail "--worktree-dir requires a path."
      worktree_path="$1"
      ;;
    -h|--help)
      usage
      exit 0
      ;;
    *)
      if [ -n "$bump_type" ]; then
        release_fail "only one bump type may be provided."
      fi
      bump_type="$1"
      ;;
  esac
  shift
done

if [[ ! "$bump_type" =~ ^(patch|minor|major)$ ]]; then
  usage
  exit 1
fi

release_remote="$(resolve_release_remote)"
fetch_release_remote "$release_remote"

last_stable_tag="$(get_last_stable_tag)"
current_stable_version="$(get_current_stable_version)"
target_stable_version="$(compute_bumped_version "$current_stable_version" "$bump_type")"
target_canary_version="$(next_canary_version "$target_stable_version")"
release_branch="$(release_branch_name "$target_stable_version")"
release_tag="v$target_stable_version"

if [ -z "$worktree_path" ]; then
  worktree_path="$(default_release_worktree_path "$target_stable_version")"
fi

if stable_release_exists_anywhere "$target_stable_version" "$release_remote"; then
  release_fail "release train $release_branch is frozen because $release_tag already exists locally, remotely, or version $target_stable_version is already on npm."
fi

branch_exists_local=false
branch_exists_remote=false
branch_worktree_path=""
created_worktree=false
created_branch=false
pushed_branch=false

if git_local_branch_exists "$release_branch"; then
  branch_exists_local=true
fi

if git_remote_branch_exists "$release_branch" "$release_remote"; then
  branch_exists_remote=true
fi

branch_worktree_path="$(git_worktree_path_for_branch "$release_branch")"
if [ -n "$branch_worktree_path" ]; then
  worktree_path="$branch_worktree_path"
fi

if [ -e "$worktree_path" ] && ! path_is_worktree_for_branch "$worktree_path" "$release_branch"; then
  release_fail "path $worktree_path already exists and is not a worktree for $release_branch."
fi

if [ -z "$branch_worktree_path" ]; then
  if [ "$dry_run" = true ]; then
    if [ "$branch_exists_local" = true ] || [ "$branch_exists_remote" = true ]; then
      release_info "[dry-run] Would add worktree $worktree_path for existing branch $release_branch"
    else
      release_info "[dry-run] Would create branch $release_branch from $release_remote/master"
      release_info "[dry-run] Would add worktree $worktree_path"
    fi
  else
    if [ "$branch_exists_local" = true ]; then
      git -C "$REPO_ROOT" worktree add "$worktree_path" "$release_branch"
    elif [ "$branch_exists_remote" = true ]; then
      git -C "$REPO_ROOT" branch --track "$release_branch" "$release_remote/$release_branch"
      git -C "$REPO_ROOT" worktree add "$worktree_path" "$release_branch"
      created_branch=true
    else
      git -C "$REPO_ROOT" worktree add -b "$release_branch" "$worktree_path" "$release_remote/master"
      created_branch=true
    fi
    created_worktree=true
  fi
fi

if [ "$dry_run" = false ] && [ "$push_branch" = true ] && [ "$branch_exists_remote" = false ]; then
  git -C "$worktree_path" push -u "$release_remote" "$release_branch"
  pushed_branch=true
fi

if [ "$dry_run" = false ] && [ "$branch_exists_remote" = true ]; then
  git -C "$worktree_path" branch --set-upstream-to "$release_remote/$release_branch" "$release_branch" >/dev/null 2>&1 || true
fi

release_info ""
release_info "==> Release train"
release_info " Remote: $release_remote"
release_info " Last stable tag: ${last_stable_tag:-<none>}"
release_info " Current stable version: $current_stable_version"
release_info " Bump: $bump_type"
release_info " Target stable version: $target_stable_version"
release_info " Next canary version: $target_canary_version"
release_info " Branch: $release_branch"
release_info " Tag (reserved until stable publish): $release_tag"
release_info " Worktree: $worktree_path"
release_info " Release notes path: $worktree_path/releases/v${target_stable_version}.md"

release_info ""
release_info "==> Status"
if [ -n "$branch_worktree_path" ]; then
  release_info " ✓ Reusing existing worktree for $release_branch"
elif [ "$dry_run" = true ]; then
  release_info " ✓ Dry run only; no branch or worktree created"
else
  [ "$created_branch" = true ] && release_info " ✓ Created branch $release_branch"
  [ "$created_worktree" = true ] && release_info " ✓ Created worktree $worktree_path"
fi

if [ "$branch_exists_remote" = true ]; then
  release_info " ✓ Remote branch already exists on $release_remote"
elif [ "$dry_run" = true ] && [ "$push_branch" = true ]; then
  release_info " [dry-run] Would push $release_branch to $release_remote"
elif [ "$push_branch" = true ] && [ "$pushed_branch" = true ]; then
  release_info " ✓ Pushed $release_branch to $release_remote"
elif [ "$push_branch" = false ]; then
  release_warn "release branch was not pushed. Stable publish will later refuse until the branch exists on $release_remote."
fi

release_info ""
release_info "Next steps:"
release_info " cd $worktree_path"
release_info " Draft or update releases/v${target_stable_version}.md"
release_info " ./scripts/release-preflight.sh canary $bump_type"
release_info " ./scripts/release.sh $bump_type --canary"
release_info ""
release_info "Merge rule:"
release_info " Merge $release_branch back to master without squash or rebase so tag $release_tag remains reachable from master."
@@ -1,80 +1,46 @@
#!/usr/bin/env bash
set -euo pipefail

# release.sh — Prepare and publish a Paperclip release.
#
# Stable release:
#   ./scripts/release.sh patch
#   ./scripts/release.sh minor --dry-run
#
# Canary release:
#   ./scripts/release.sh patch --canary
#   ./scripts/release.sh minor --canary --dry-run
#
# Canary releases publish prerelease versions such as 1.2.3-canary.0 under the
# npm dist-tag "canary". Stable releases publish 1.2.3 under "latest".

REPO_ROOT="$(cd "$(dirname "$0")/.." && pwd)"
# shellcheck source=./release-lib.sh
. "$REPO_ROOT/scripts/release-lib.sh"
CLI_DIR="$REPO_ROOT/cli"
TEMP_CHANGESET_FILE="$REPO_ROOT/.changeset/release-bump.md"
TEMP_PRE_FILE="$REPO_ROOT/.changeset/pre.json"

channel=""
release_date=""
dry_run=false
canary=false
bump_type=""
skip_verify=false
print_version_only=false
tag_name=""

cleanup_on_exit=false

usage() {
  cat <<'EOF'
Usage:
  ./scripts/release.sh <patch|minor|major> [--canary] [--dry-run]
  ./scripts/release.sh <canary|stable> [--date YYYY-MM-DD] [--dry-run] [--skip-verify] [--print-version]

Examples:
  ./scripts/release.sh patch
  ./scripts/release.sh minor --dry-run
  ./scripts/release.sh patch --canary
  ./scripts/release.sh minor --canary --dry-run
  ./scripts/release.sh canary
  ./scripts/release.sh canary --date 2026-03-17 --dry-run
  ./scripts/release.sh stable
  ./scripts/release.sh stable --date 2026-03-17 --dry-run
  ./scripts/release.sh stable --date 2026-03-18 --print-version

Notes:
  - Canary publishes prerelease versions like 1.2.3-canary.0 under the npm
    dist-tag "canary".
  - Stable publishes 1.2.3 under the npm dist-tag "latest".
  - Run this from branch release/X.Y.Z matching the computed target version.
  - Dry runs leave the working tree clean.
  - Stable versions use YYYY.MDD.P, where M is the UTC month, DD is the
    zero-padded UTC day, and P is the same-day stable patch slot.
  - Canary releases publish YYYY.MDD.P-canary.N under the npm dist-tag
    "canary" and create the git tag canary/vYYYY.MDD.P-canary.N.
  - Stable releases publish YYYY.MDD.P under the npm dist-tag "latest" and
    create the git tag vYYYY.MDD.P.
  - Stable release notes must already exist at releases/vYYYY.MDD.P.md.
  - The script rewrites versions temporarily and restores the working tree on
    exit. Tags always point at the original source commit, not a generated
    release commit.
EOF
}
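The tag conventions in the Notes above can be sketched as two one-liners (a minimal sketch mirroring the `stable_tag_name` and `canary_tag_name` helpers defined in release-lib.sh):

```javascript
// Stable tags are "v" + version; canary tags live under the "canary/" prefix.
const stableTag = (version) => `v${version}`;
const canaryTag = (version) => `canary/v${version}`;

console.log(stableTag("2026.318.0"));          // v2026.318.0
console.log(canaryTag("2026.318.1-canary.0")); // canary/v2026.318.1-canary.0
```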
|
||||
|
||||
while [ $# -gt 0 ]; do
|
||||
case "$1" in
|
||||
--dry-run) dry_run=true ;;
|
||||
--canary) canary=true ;;
|
||||
-h|--help)
|
||||
usage
|
||||
exit 0
|
||||
;;
|
||||
--promote)
|
||||
echo "Error: --promote was removed. Re-run a stable release from the vetted commit instead."
|
||||
exit 1
|
||||
;;
|
||||
*)
|
||||
if [ -n "$bump_type" ]; then
|
||||
echo "Error: only one bump type may be provided."
|
||||
exit 1
|
||||
fi
|
||||
bump_type="$1"
|
||||
;;
|
||||
esac
|
||||
shift
|
||||
done
|
||||
|
||||
if [[ ! "$bump_type" =~ ^(patch|minor|major)$ ]]; then
|
||||
usage
|
||||
exit 1
|
||||
fi
|
||||
|
||||
restore_publish_artifacts() {
|
||||
if [ -f "$CLI_DIR/package.dev.json" ]; then
|
||||
mv "$CLI_DIR/package.dev.json" "$CLI_DIR/package.json"
|
||||
@@ -91,8 +57,6 @@ restore_publish_artifacts() {
|
||||
cleanup_release_state() {
|
||||
restore_publish_artifacts
|
||||
|
||||
rm -f "$TEMP_CHANGESET_FILE" "$TEMP_PRE_FILE"
|
||||
|
||||
tracked_changes="$(git -C "$REPO_ROOT" diff --name-only; git -C "$REPO_ROOT" diff --cached --name-only)"
|
||||
if [ -n "$tracked_changes" ]; then
|
||||
printf '%s\n' "$tracked_changes" | sort -u | while IFS= read -r path; do
|
||||
@@ -114,260 +78,140 @@ cleanup_release_state() {
|
||||
fi
|
||||
}
|
||||
|
||||
if [ "$cleanup_on_exit" = true ]; then
|
||||
trap cleanup_release_state EXIT
|
||||
fi
|
||||
|
||||
set_cleanup_trap() {
|
||||
cleanup_on_exit=true
|
||||
trap cleanup_release_state EXIT
|
||||
}
|
||||
|
||||
require_npm_publish_auth() {
|
||||
if [ "$dry_run" = true ]; then
|
||||
return
|
||||
fi
|
||||
while [ $# -gt 0 ]; do
|
||||
case "$1" in
|
||||
canary|stable)
|
||||
if [ -n "$channel" ]; then
|
||||
release_fail "only one release channel may be provided."
|
||||
fi
|
||||
channel="$1"
|
||||
;;
|
||||
--date)
|
||||
shift
|
||||
[ $# -gt 0 ] || release_fail "--date requires YYYY-MM-DD."
|
||||
release_date="$1"
|
||||
;;
|
||||
--dry-run) dry_run=true ;;
|
||||
--skip-verify) skip_verify=true ;;
|
||||
--print-version) print_version_only=true ;;
|
||||
-h|--help)
|
||||
usage
|
||||
exit 0
|
||||
;;
|
||||
*)
|
||||
release_fail "unexpected argument: $1"
|
||||
;;
|
||||
esac
|
||||
shift
|
||||
done
|
||||
|
||||
if npm whoami >/dev/null 2>&1; then
|
||||
release_info " ✓ Logged in to npm as $(npm whoami)"
|
||||
return
|
||||
fi
|
||||
|
||||
if [ "${GITHUB_ACTIONS:-}" = "true" ]; then
|
||||
release_info " ✓ npm publish auth will be provided by GitHub Actions trusted publishing"
|
||||
return
|
||||
fi
|
||||
|
||||
release_fail "npm publish auth is not available. Use 'npm login' locally or run from the GitHub release workflow."
|
||||
}
|
||||
|
||||
list_public_package_info() {
|
||||
node - "$REPO_ROOT" <<'NODE'
|
||||
const fs = require('fs');
|
||||
const path = require('path');
|
||||
|
||||
const root = process.argv[2];
|
||||
const roots = ['packages', 'server', 'ui', 'cli'];
|
||||
const seen = new Set();
|
||||
const rows = [];
|
||||
|
||||
function walk(relDir) {
|
||||
const absDir = path.join(root, relDir);
|
||||
const pkgPath = path.join(absDir, 'package.json');
|
||||
|
||||
if (fs.existsSync(pkgPath)) {
|
||||
const pkg = JSON.parse(fs.readFileSync(pkgPath, 'utf8'));
|
||||
if (!pkg.private) {
|
||||
rows.push([relDir, pkg.name]);
|
||||
}
|
||||
return;
|
||||
}
|
||||
|
||||
if (!fs.existsSync(absDir)) {
|
||||
return;
|
||||
}
|
||||
|
||||
for (const entry of fs.readdirSync(absDir, { withFileTypes: true })) {
|
||||
if (!entry.isDirectory()) continue;
|
||||
if (entry.name === 'node_modules' || entry.name === 'dist' || entry.name === '.git') continue;
|
||||
walk(path.join(relDir, entry.name));
|
||||
}
|
||||
}
|
||||
|
||||
for (const rel of roots) {
|
||||
walk(rel);
|
||||
}
|
||||
|
||||
rows.sort((a, b) => a[0].localeCompare(b[0]));
|
||||
|
||||
for (const [dir, name] of rows) {
|
||||
const pkgPath = path.join(root, dir, 'package.json');
|
||||
const pkg = JSON.parse(fs.readFileSync(pkgPath, 'utf8'));
|
||||
const key = `${dir}\t${name}\t${pkg.version}`;
|
||||
if (seen.has(key)) continue;
|
||||
seen.add(key);
|
||||
process.stdout.write(`${dir}\t${name}\t${pkg.version}\n`);
|
||||
}
|
||||
NODE
|
||||
}

replace_version_string() {
  local from_version="$1"
  local to_version="$2"

  node - "$REPO_ROOT" "$from_version" "$to_version" <<'NODE'
const fs = require('fs');
const path = require('path');

const root = process.argv[2];
const fromVersion = process.argv[3];
const toVersion = process.argv[4];

const roots = ['packages', 'server', 'ui', 'cli'];
const targets = new Set(['package.json', 'CHANGELOG.md']);
const extraFiles = [path.join('cli', 'src', 'index.ts')];

function rewriteFile(filePath) {
  if (!fs.existsSync(filePath)) return;
  const current = fs.readFileSync(filePath, 'utf8');
  if (!current.includes(fromVersion)) return;
  fs.writeFileSync(filePath, current.split(fromVersion).join(toVersion));
}

function walk(relDir) {
  const absDir = path.join(root, relDir);
  if (!fs.existsSync(absDir)) return;

  for (const entry of fs.readdirSync(absDir, { withFileTypes: true })) {
    if (entry.isDirectory()) {
      if (entry.name === 'node_modules' || entry.name === 'dist' || entry.name === '.git') continue;
      walk(path.join(relDir, entry.name));
      continue;
    }

    if (targets.has(entry.name)) {
      rewriteFile(path.join(absDir, entry.name));
    }
  }
}

for (const rel of roots) {
  walk(rel);
}

for (const relFile of extraFiles) {
  rewriteFile(path.join(root, relFile));
}
NODE
[ -n "$channel" ] || {
  usage
  exit 1
}

PUBLISH_REMOTE="$(resolve_release_remote)"
fetch_release_remote "$PUBLISH_REMOTE"

CURRENT_BRANCH="$(git_current_branch)"
CURRENT_SHA="$(git -C "$REPO_ROOT" rev-parse HEAD)"
LAST_STABLE_TAG="$(get_last_stable_tag)"
CURRENT_STABLE_VERSION="$(get_current_stable_version)"
RELEASE_DATE="${release_date:-$(utc_date_iso)}"

TARGET_STABLE_VERSION="$(compute_bumped_version "$CURRENT_STABLE_VERSION" "$bump_type")"
PUBLIC_PACKAGE_INFO="$(list_public_package_info)"
PUBLIC_PACKAGE_NAMES=()
while IFS= read -r package_name; do
  [ -n "$package_name" ] || continue
  PUBLIC_PACKAGE_NAMES+=("$package_name")
done < <(printf '%s\n' "$PUBLIC_PACKAGE_INFO" | cut -f2)

[ -n "$PUBLIC_PACKAGE_INFO" ] || release_fail "no public packages were found in the workspace."

TARGET_STABLE_VERSION="$(next_stable_version "$RELEASE_DATE" "${PUBLIC_PACKAGE_NAMES[@]}")"
TARGET_PUBLISH_VERSION="$TARGET_STABLE_VERSION"
CURRENT_BRANCH="$(git_current_branch)"
EXPECTED_RELEASE_BRANCH="$(release_branch_name "$TARGET_STABLE_VERSION")"
DIST_TAG="latest"

if [ "$channel" = "canary" ]; then
  require_on_master_branch
  TARGET_PUBLISH_VERSION="$(next_canary_version "$TARGET_STABLE_VERSION" "${PUBLIC_PACKAGE_NAMES[@]}")"
  DIST_TAG="canary"
  tag_name="$(canary_tag_name "$TARGET_PUBLISH_VERSION")"
else
  tag_name="$(stable_tag_name "$TARGET_STABLE_VERSION")"
fi

if [ "$print_version_only" = true ]; then
  printf '%s\n' "$TARGET_PUBLISH_VERSION"
  exit 0
fi

NOTES_FILE="$(release_notes_file "$TARGET_STABLE_VERSION")"
RELEASE_TAG="v$TARGET_STABLE_VERSION"

if [ "$canary" = true ]; then
  TARGET_PUBLISH_VERSION="$(next_canary_version "$TARGET_STABLE_VERSION")"
fi

if [ "$TARGET_STABLE_VERSION" = "$CURRENT_STABLE_VERSION" ]; then
  release_fail "next stable version matches the current stable version. Refusing to publish."
fi

if [[ "$TARGET_PUBLISH_VERSION" == "${CURRENT_STABLE_VERSION}-canary."* ]]; then
  release_fail "canary versions must be derived from the next stable version, never ${CURRENT_STABLE_VERSION}-canary.N."
fi

require_clean_worktree
ensure_release_branch_for_version "$TARGET_STABLE_VERSION"
require_npm_publish_auth "$dry_run"

if git_local_tag_exists "$RELEASE_TAG" || git_remote_tag_exists "$RELEASE_TAG" "$PUBLISH_REMOTE"; then
  release_fail "release train $EXPECTED_RELEASE_BRANCH is frozen because tag $RELEASE_TAG already exists locally or on $PUBLISH_REMOTE."
fi

if npm_version_exists "$TARGET_STABLE_VERSION"; then
  release_fail "stable version $TARGET_STABLE_VERSION is already published on npm. Refusing to reuse release train $EXPECTED_RELEASE_BRANCH."
fi

if [ "$canary" = false ] && [ ! -f "$NOTES_FILE" ]; then
if [ "$channel" = "stable" ] && [ ! -f "$NOTES_FILE" ]; then
  release_fail "stable release notes file is required at $NOTES_FILE before publishing stable."
fi

if [ "$canary" = true ] && [ ! -f "$NOTES_FILE" ]; then
  release_warn "stable release notes file is missing at $NOTES_FILE. Draft it before you finalize stable."
if [ "$channel" = "canary" ] && [ -f "$NOTES_FILE" ]; then
  release_info " ✓ Stable release notes already exist at $NOTES_FILE"
fi

if ! git_remote_branch_exists "$EXPECTED_RELEASE_BRANCH" "$PUBLISH_REMOTE"; then
if [ "$canary" = false ] && [ "$dry_run" = false ]; then
  release_fail "remote branch $EXPECTED_RELEASE_BRANCH does not exist on $PUBLISH_REMOTE. Run ./scripts/release-start.sh $bump_type first or push the branch before stable publish."
if git_local_tag_exists "$tag_name" || git_remote_tag_exists "$tag_name" "$PUBLISH_REMOTE"; then
  release_fail "git tag $tag_name already exists locally or on $PUBLISH_REMOTE."
fi

while IFS= read -r package_name; do
  [ -z "$package_name" ] && continue
  if npm_package_version_exists "$package_name" "$TARGET_PUBLISH_VERSION"; then
    release_fail "npm version ${package_name}@${TARGET_PUBLISH_VERSION} already exists."
  fi
  release_warn "remote branch $EXPECTED_RELEASE_BRANCH does not exist on $PUBLISH_REMOTE yet."
fi

PUBLIC_PACKAGE_INFO="$(list_public_package_info)"
PUBLIC_PACKAGE_NAMES="$(printf '%s\n' "$PUBLIC_PACKAGE_INFO" | cut -f2)"
PUBLIC_PACKAGE_DIRS="$(printf '%s\n' "$PUBLIC_PACKAGE_INFO" | cut -f1)"

if [ -z "$PUBLIC_PACKAGE_INFO" ]; then
  release_fail "no public packages were found in the workspace."
fi
done <<< "$(printf '%s\n' "${PUBLIC_PACKAGE_NAMES[@]}")"

release_info ""
release_info "==> Release plan"
release_info " Remote: $PUBLISH_REMOTE"
release_info " Channel: $channel"
release_info " Current branch: ${CURRENT_BRANCH:-<detached>}"
release_info " Expected branch: $EXPECTED_RELEASE_BRANCH"
release_info " Source commit: $CURRENT_SHA"
release_info " Last stable tag: ${LAST_STABLE_TAG:-<none>}"
release_info " Current stable version: $CURRENT_STABLE_VERSION"
if [ "$canary" = true ]; then
  release_info " Target stable version: $TARGET_STABLE_VERSION"
release_info " Release date (UTC): $RELEASE_DATE"
release_info " Target stable version: $TARGET_STABLE_VERSION"
if [ "$channel" = "canary" ]; then
  release_info " Canary version: $TARGET_PUBLISH_VERSION"
  release_info " Guard: canary is derived from next stable version, not ${CURRENT_STABLE_VERSION}-canary.N"
else
  release_info " Stable version: $TARGET_STABLE_VERSION"
  release_info " Stable version: $TARGET_PUBLISH_VERSION"
fi
release_info " Dist-tag: $DIST_TAG"
release_info " Git tag: $tag_name"
if [ "$channel" = "stable" ]; then
  release_info " Release notes: $NOTES_FILE"
fi

set_cleanup_trap

if [ "$skip_verify" = false ]; then
  release_info ""
  release_info "==> Step 1/7: Verification gate..."
  cd "$REPO_ROOT"
  pnpm -r typecheck
  pnpm test:run
  pnpm build
else
  release_info ""
  release_info "==> Step 1/7: Verification gate skipped (--skip-verify)"
fi

release_info ""
release_info "==> Step 1/7: Preflight checks..."
release_info " ✓ Working tree is clean"
release_info " ✓ Branch matches release train"
require_npm_publish_auth

if [ "$dry_run" = true ] || [ "$canary" = true ]; then
  set_cleanup_trap
fi

release_info ""
release_info "==> Step 2/7: Creating release changeset..."
{
  echo "---"
  while IFS= read -r pkg_name; do
    [ -z "$pkg_name" ] && continue
    echo "\"$pkg_name\": $bump_type"
  done <<< "$PUBLIC_PACKAGE_NAMES"
  echo "---"
  echo ""
  if [ "$canary" = true ]; then
    echo "Canary release preparation for $TARGET_STABLE_VERSION"
  else
    echo "Stable release preparation for $TARGET_STABLE_VERSION"
  fi
} > "$TEMP_CHANGESET_FILE"
release_info " ✓ Created release changeset for $(printf '%s\n' "$PUBLIC_PACKAGE_NAMES" | sed '/^$/d' | wc -l | xargs) packages"

release_info ""
release_info "==> Step 3/7: Versioning packages..."
cd "$REPO_ROOT"
if [ "$canary" = true ]; then
  npx changeset pre enter canary
fi
npx changeset version

if [ "$canary" = true ]; then
  BASE_CANARY_VERSION="${TARGET_STABLE_VERSION}-canary.0"
  if [ "$TARGET_PUBLISH_VERSION" != "$BASE_CANARY_VERSION" ]; then
    replace_version_string "$BASE_CANARY_VERSION" "$TARGET_PUBLISH_VERSION"
  fi
fi

VERSIONED_PACKAGE_INFO="$(list_public_package_info)"

VERSION_IN_CLI_PACKAGE="$(node -e "console.log(require('$CLI_DIR/package.json').version)")"
if [ "$VERSION_IN_CLI_PACKAGE" != "$TARGET_PUBLISH_VERSION" ]; then
  release_fail "versioning drift detected. Expected $TARGET_PUBLISH_VERSION but found $VERSION_IN_CLI_PACKAGE."
fi
release_info " ✓ Versioned workspace to $TARGET_PUBLISH_VERSION"

release_info ""
release_info "==> Step 4/7: Building workspace artifacts..."
release_info "==> Step 2/7: Building workspace artifacts..."
cd "$REPO_ROOT"
pnpm build
bash "$REPO_ROOT/scripts/prepare-server-ui-dist.sh"
@@ -378,42 +222,52 @@ done
release_info " ✓ Workspace build complete"

release_info ""
release_info "==> Step 5/7: Building publishable CLI bundle..."
"$REPO_ROOT/scripts/build-npm.sh" --skip-checks
release_info "==> Step 3/7: Rewriting workspace versions..."
set_public_package_version "$TARGET_PUBLISH_VERSION"
release_info " ✓ Versioned workspace to $TARGET_PUBLISH_VERSION"

release_info ""
release_info "==> Step 4/7: Building publishable CLI bundle..."
"$REPO_ROOT/scripts/build-npm.sh" --skip-checks --skip-typecheck
release_info " ✓ CLI bundle ready"

VERSIONED_PACKAGE_INFO="$(list_public_package_info)"
VERSION_IN_CLI_PACKAGE="$(node -e "console.log(require('$CLI_DIR/package.json').version)")"
if [ "$VERSION_IN_CLI_PACKAGE" != "$TARGET_PUBLISH_VERSION" ]; then
  release_fail "versioning drift detected. Expected $TARGET_PUBLISH_VERSION but found $VERSION_IN_CLI_PACKAGE."
fi

release_info ""
if [ "$dry_run" = true ]; then
  release_info "==> Step 6/7: Previewing publish payloads (--dry-run)..."
  while IFS= read -r pkg_dir; do
  release_info "==> Step 5/7: Previewing publish payloads (--dry-run)..."
  while IFS=$'\t' read -r pkg_dir _pkg_name _pkg_version; do
    [ -z "$pkg_dir" ] && continue
    release_info " --- $pkg_dir ---"
    cd "$REPO_ROOT/$pkg_dir"
    npm pack --dry-run 2>&1 | tail -3
  done <<< "$PUBLIC_PACKAGE_DIRS"
  cd "$REPO_ROOT"
  if [ "$canary" = true ]; then
    release_info " [dry-run] Would publish ${TARGET_PUBLISH_VERSION} under dist-tag canary"
  else
    release_info " [dry-run] Would publish ${TARGET_PUBLISH_VERSION} under dist-tag latest"
  fi
  pnpm publish --dry-run --no-git-checks --tag "$DIST_TAG" 2>&1 | tail -3
  done <<< "$VERSIONED_PACKAGE_INFO"
  release_info " [dry-run] Would create git tag $tag_name on $CURRENT_SHA"
else
  if [ "$canary" = true ]; then
    release_info "==> Step 6/7: Publishing canary to npm..."
    npx changeset publish
    release_info " ✓ Published ${TARGET_PUBLISH_VERSION} under dist-tag canary"
  else
    release_info "==> Step 6/7: Publishing stable release to npm..."
    npx changeset publish
    release_info " ✓ Published ${TARGET_PUBLISH_VERSION} under dist-tag latest"
  fi
  release_info "==> Step 5/7: Publishing packages to npm..."
  while IFS=$'\t' read -r pkg_dir pkg_name pkg_version; do
    [ -z "$pkg_dir" ] && continue
    release_info " Publishing $pkg_name@$pkg_version"
    cd "$REPO_ROOT/$pkg_dir"
    pnpm publish --no-git-checks --tag "$DIST_TAG" --access public
  done <<< "$VERSIONED_PACKAGE_INFO"
  release_info " ✓ Published all packages under dist-tag $DIST_TAG"
fi

release_info ""
release_info "==> Post-publish verification: Confirming npm package availability..."
release_info ""
if [ "$dry_run" = true ]; then
  release_info "==> Step 6/7: Skipping npm verification in dry-run mode..."
else
  release_info "==> Step 6/7: Confirming npm package availability..."
  VERIFY_ATTEMPTS="${NPM_PUBLISH_VERIFY_ATTEMPTS:-12}"
  VERIFY_DELAY_SECONDS="${NPM_PUBLISH_VERIFY_DELAY_SECONDS:-5}"
  MISSING_PUBLISHED_PACKAGES=""
  while IFS=$'\t' read -r pkg_dir pkg_name pkg_version; do

  while IFS=$'\t' read -r _pkg_dir pkg_name pkg_version; do
    [ -z "$pkg_name" ] && continue
    release_info " Checking $pkg_name@$pkg_version"
    if wait_for_npm_package_version "$pkg_name" "$pkg_version" "$VERIFY_ATTEMPTS" "$VERIFY_DELAY_SECONDS"; then
@@ -427,49 +281,32 @@ else
    MISSING_PUBLISHED_PACKAGES="${MISSING_PUBLISHED_PACKAGES}${pkg_name}@${pkg_version}"
  done <<< "$VERSIONED_PACKAGE_INFO"

  if [ -n "$MISSING_PUBLISHED_PACKAGES" ]; then
    release_fail "publish completed but npm never exposed: $MISSING_PUBLISHED_PACKAGES. Inspect the changeset publish output before treating this release as good."
  fi
  [ -z "$MISSING_PUBLISHED_PACKAGES" ] || release_fail "publish completed but npm never exposed: $MISSING_PUBLISHED_PACKAGES"

  release_info " ✓ Verified all versioned packages are available on npm"
fi
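The verification loop above relies on `wait_for_npm_package_version`, whose body is not shown in this diff. A minimal sketch of the retry shape it presumably implements — poll a checker until it succeeds or the attempts run out. The function name `wait_for_version`, the injectable `check` argument, and the package coordinates are illustrative, not the script's actual helper:

```shell
wait_for_version() {
  local name="$1" version="$2" attempts="$3" delay="$4" check="$5"
  local i
  for ((i = 1; i <= attempts; i++)); do
    # In the real script this check would presumably be an
    # `npm view name@version` call against the registry.
    if "$check" "$name" "$version"; then
      return 0
    fi
    sleep "$delay"
  done
  return 1
}

# Fake checker that succeeds on its second call, so the sketch is
# demonstrable without touching the npm registry.
calls=0
fake_check() { calls=$((calls + 1)); [ "$calls" -ge 2 ]; }
wait_for_version paperclipai 2026.318.0 5 0 fake_check && echo "visible after $calls checks"
```

This prints `visible after 2 checks`: the first poll misses, the second succeeds.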

release_info ""
if [ "$dry_run" = true ]; then
  release_info "==> Step 7/7: Cleaning up dry-run state..."
  release_info " ✓ Dry run leaves the working tree unchanged"
elif [ "$canary" = true ]; then
  release_info "==> Step 7/7: Cleaning up canary state..."
  release_info " ✓ Canary state will be discarded after publish"
  release_info "==> Step 7/7: Dry run complete..."
else
  release_info "==> Step 7/7: Finalizing stable release commit..."
  restore_publish_artifacts

  git -C "$REPO_ROOT" add -u .changeset packages server cli
  if [ -f "$REPO_ROOT/releases/v${TARGET_STABLE_VERSION}.md" ]; then
    git -C "$REPO_ROOT" add "releases/v${TARGET_STABLE_VERSION}.md"
  fi

  git -C "$REPO_ROOT" commit -m "chore: release v$TARGET_STABLE_VERSION"
  git -C "$REPO_ROOT" tag "v$TARGET_STABLE_VERSION"
  release_info " ✓ Created commit and tag v$TARGET_STABLE_VERSION"
  release_info "==> Step 7/7: Creating git tag..."
  git -C "$REPO_ROOT" tag "$tag_name" "$CURRENT_SHA"
  release_info " ✓ Created tag $tag_name on $CURRENT_SHA"
fi

release_info ""
if [ "$dry_run" = true ]; then
  if [ "$canary" = true ]; then
    release_info "Dry run complete for canary ${TARGET_PUBLISH_VERSION}."
  else
    release_info "Dry run complete for stable v${TARGET_STABLE_VERSION}."
  fi
elif [ "$canary" = true ]; then
  release_info "Published canary ${TARGET_PUBLISH_VERSION}."
  release_info "Install with: npx paperclipai@canary onboard"
  release_info "Stable version remains: $CURRENT_STABLE_VERSION"
  release_info "Dry run complete for $channel ${TARGET_PUBLISH_VERSION}."
else
  release_info "Published stable v${TARGET_STABLE_VERSION}."
  release_info "Next steps:"
  release_info " git push ${PUBLISH_REMOTE} HEAD --follow-tags"
  release_info " ./scripts/create-github-release.sh $TARGET_STABLE_VERSION"
  release_info " Open a PR from ${EXPECTED_RELEASE_BRANCH} to master and merge without squash or rebase"
  if [ "$channel" = "canary" ]; then
    release_info "Published canary ${TARGET_PUBLISH_VERSION}."
    release_info "Install with: npx paperclipai@canary onboard"
    release_info "Next step: git push ${PUBLISH_REMOTE} refs/tags/${tag_name}"
  else
    release_info "Published stable ${TARGET_PUBLISH_VERSION}."
    release_info "Next steps:"
    release_info " git push ${PUBLISH_REMOTE} refs/tags/${tag_name}"
    release_info " ./scripts/create-github-release.sh $TARGET_STABLE_VERSION"
  fi
fi

@@ -12,8 +12,8 @@ Usage:
./scripts/rollback-latest.sh <stable-version> [--dry-run]

Examples:
./scripts/rollback-latest.sh 1.2.2
./scripts/rollback-latest.sh 1.2.2 --dry-run
./scripts/rollback-latest.sh 2026.318.0
./scripts/rollback-latest.sh 2026.318.0 --dry-run

Notes:
- This repoints the npm dist-tag "latest" for every public package.
@@ -45,7 +45,7 @@ if [ -z "$version" ]; then
fi

if [[ ! "$version" =~ ^[0-9]+\.[0-9]+\.[0-9]+$ ]]; then
  echo "Error: version must be a stable semver like 1.2.2." >&2
  echo "Error: version must be a stable calendar version like 2026.318.0." >&2
  exit 1
fi
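Per the rollback script's notes, the rollback repoints the npm dist-tag `latest` for every public package, which boils down to one `npm dist-tag add` per package. A hedged sketch that only prints the commands rather than running them (the package names and versions below are illustrative; the real script derives them from the workspace):

```shell
# For each "dir<TAB>name<TAB>version" row, print the npm command that
# would repoint the "latest" dist-tag. Rows here are illustrative.
printf '%s\n' \
  $'cli\tpaperclipai\t2026.318.0' \
  $'packages/db\t@paperclipai/db\t2026.318.0' |
while IFS=$'\t' read -r _dir name version; do
  echo "npm dist-tag add ${name}@${version} latest"
done
```

Dropping the `echo` would execute the repoint for real, which is why the script gates it behind a version-format check and a `--dry-run` flag.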

@@ -2,6 +2,7 @@ import { describe, expect, it } from "vitest";
import type { agents } from "@paperclipai/db";
import { resolveDefaultAgentWorkspaceDir } from "../home-paths.js";
import {
  formatRuntimeWorkspaceWarningLog,
  prioritizeProjectWorkspaceCandidatesForRun,
  parseSessionCompactionPolicy,
  resolveRuntimeSessionParamsForWorkspace,
@@ -181,6 +182,15 @@ describe("shouldResetTaskSessionForWake", () => {
  });
});

describe("formatRuntimeWorkspaceWarningLog", () => {
  it("emits informational workspace warnings on stdout", () => {
    expect(formatRuntimeWorkspaceWarningLog("Using fallback workspace")).toEqual({
      stream: "stdout",
      chunk: "[paperclip] Using fallback workspace\n",
    });
  });
});

describe("prioritizeProjectWorkspaceCandidatesForRun", () => {
  it("moves the explicitly selected workspace to the front", () => {
    const rows = [

@@ -498,6 +498,13 @@ export function shouldResetTaskSessionForWake(
  return false;
}

export function formatRuntimeWorkspaceWarningLog(warning: string) {
  return {
    stream: "stdout" as const,
    chunk: `[paperclip] ${warning}\n`,
  };
}

function describeSessionResetReason(
  contextSnapshot: Record<string, unknown> | null | undefined,
) {
@@ -2062,7 +2069,8 @@ export function heartbeatService(db: Db) {
  });
};
for (const warning of runtimeWorkspaceWarnings) {
  await onLog("stderr", `[paperclip] ${warning}\n`);
  const logEntry = formatRuntimeWorkspaceWarningLog(warning);
  await onLog(logEntry.stream, logEntry.chunk);
}
const adapterEnv = Object.fromEntries(
  Object.entries(parseObject(resolvedConfig.env)).filter(

@@ -22,14 +22,11 @@ const TASK_TITLE = "E2E test task";

test.describe("Onboarding wizard", () => {
  test("completes full wizard flow", async ({ page }) => {
    // Navigate to root — should auto-open onboarding when no companies exist
    await page.goto("/");

    // If the wizard didn't auto-open (company already exists), click the button
    const wizardHeading = page.locator("h3", { hasText: "Name your company" });
    const newCompanyBtn = page.getByRole("button", { name: "New Company" });

    // Wait for either the wizard or the start page
    await expect(
      wizardHeading.or(newCompanyBtn)
    ).toBeVisible({ timeout: 15_000 });
@@ -38,40 +35,28 @@ test.describe("Onboarding wizard", () => {
      await newCompanyBtn.click();
    }

    // -----------------------------------------------------------
    // Step 1: Name your company
    // -----------------------------------------------------------
    await expect(wizardHeading).toBeVisible({ timeout: 5_000 });
    await expect(page.locator("text=Step 1 of 4")).toBeVisible();

    const companyNameInput = page.locator('input[placeholder="Acme Corp"]');
    await companyNameInput.fill(COMPANY_NAME);

    // Click Next
    const nextButton = page.getByRole("button", { name: "Next" });
    await nextButton.click();

    // -----------------------------------------------------------
    // Step 2: Create your first agent
    // -----------------------------------------------------------
    await expect(
      page.locator("h3", { hasText: "Create your first agent" })
    ).toBeVisible({ timeout: 10_000 });
    await expect(page.locator("text=Step 2 of 4")).toBeVisible();

    // Agent name should default to "CEO"
    const agentNameInput = page.locator('input[placeholder="CEO"]');
    await expect(agentNameInput).toHaveValue(AGENT_NAME);

    // Claude Code adapter should be selected by default
    await expect(
      page.locator("button", { hasText: "Claude Code" }).locator("..")
    ).toBeVisible();

    // Select the "Process" adapter to avoid needing a real CLI tool installed
    await page.locator("button", { hasText: "Process" }).click();
    await page.getByRole("button", { name: "More Agent Adapter Types" }).click();
    await page.getByRole("button", { name: "Process" }).click();

    // Fill in process adapter fields
    const commandInput = page.locator('input[placeholder="e.g. node, python"]');
    await commandInput.fill("echo");
    const argsInput = page.locator(
@@ -79,52 +64,34 @@ test.describe("Onboarding wizard", () => {
    );
    await argsInput.fill("hello");

    // Click Next (process adapter skips environment test)
    await page.getByRole("button", { name: "Next" }).click();

    // -----------------------------------------------------------
    // Step 3: Give it something to do
    // -----------------------------------------------------------
    await expect(
      page.locator("h3", { hasText: "Give it something to do" })
    ).toBeVisible({ timeout: 10_000 });
    await expect(page.locator("text=Step 3 of 4")).toBeVisible();

    // Clear default title and set our test title
    const taskTitleInput = page.locator(
      'input[placeholder="e.g. Research competitor pricing"]'
    );
    await taskTitleInput.clear();
    await taskTitleInput.fill(TASK_TITLE);

    // Click Next
    await page.getByRole("button", { name: "Next" }).click();

    // -----------------------------------------------------------
    // Step 4: Ready to launch
    // -----------------------------------------------------------
    await expect(
      page.locator("h3", { hasText: "Ready to launch" })
    ).toBeVisible({ timeout: 10_000 });
    await expect(page.locator("text=Step 4 of 4")).toBeVisible();

    // Verify summary displays our created entities
    await expect(page.locator("text=" + COMPANY_NAME)).toBeVisible();
    await expect(page.locator("text=" + AGENT_NAME)).toBeVisible();
    await expect(page.locator("text=" + TASK_TITLE)).toBeVisible();

    // Click "Open Issue"
    await page.getByRole("button", { name: "Open Issue" }).click();
    await page.getByRole("button", { name: "Create & Open Issue" }).click();

    // Should navigate to the issue page
    await expect(page).toHaveURL(/\/issues\//, { timeout: 10_000 });

    // -----------------------------------------------------------
    // Verify via API that entities were created
    // -----------------------------------------------------------
    const baseUrl = page.url().split("/").slice(0, 3).join("/");

    // List companies and find ours
    const companiesRes = await page.request.get(`${baseUrl}/api/companies`);
    expect(companiesRes.ok()).toBe(true);
    const companies = await companiesRes.json();
@@ -133,7 +100,6 @@ test.describe("Onboarding wizard", () => {
    );
    expect(company).toBeTruthy();

    // List agents for our company
    const agentsRes = await page.request.get(
      `${baseUrl}/api/companies/${company.id}/agents`
    );
@@ -146,7 +112,6 @@ test.describe("Onboarding wizard", () => {
    expect(ceoAgent.role).toBe("ceo");
    expect(ceoAgent.adapterType).toBe("process");

    // List issues for our company
    const issuesRes = await page.request.get(
      `${baseUrl}/api/companies/${company.id}/issues`
    );
@@ -159,7 +124,6 @@ test.describe("Onboarding wizard", () => {
    expect(task.assigneeAgentId).toBe(ceoAgent.id);

    if (!SKIP_LLM) {
      // LLM-dependent: wait for the heartbeat to transition the issue
      await expect(async () => {
        const res = await page.request.get(
          `${baseUrl}/api/issues/${task.id}`

@@ -23,7 +23,7 @@ export default defineConfig({
  // The webServer directive starts `paperclipai run` before tests.
  // Expects `pnpm paperclipai` to be runnable from repo root.
  webServer: {
    command: `pnpm paperclipai run --yes`,
    command: `pnpm paperclipai run`,
    url: `${BASE_URL}/api/health`,
    reuseExistingServer: !!process.env.CI,
    timeout: 120_000,

146
tests/release-smoke/docker-auth-onboarding.spec.ts
Normal file
@@ -0,0 +1,146 @@
import { expect, test, type Page } from "@playwright/test";

const ADMIN_EMAIL =
  process.env.PAPERCLIP_RELEASE_SMOKE_EMAIL ??
  process.env.SMOKE_ADMIN_EMAIL ??
  "smoke-admin@paperclip.local";
const ADMIN_PASSWORD =
  process.env.PAPERCLIP_RELEASE_SMOKE_PASSWORD ??
  process.env.SMOKE_ADMIN_PASSWORD ??
  "paperclip-smoke-password";

const COMPANY_NAME = `Release-Smoke-${Date.now()}`;
const AGENT_NAME = "CEO";
const TASK_TITLE = "Release smoke task";

async function signIn(page: Page) {
  await page.goto("/");
  await expect(page).toHaveURL(/\/auth/);

  await page.locator('input[type="email"]').fill(ADMIN_EMAIL);
  await page.locator('input[type="password"]').fill(ADMIN_PASSWORD);
  await page.getByRole("button", { name: "Sign In" }).click();

  await expect(page).not.toHaveURL(/\/auth/, { timeout: 20_000 });
}

async function openOnboarding(page: Page) {
  const wizardHeading = page.locator("h3", { hasText: "Name your company" });
  const startButton = page.getByRole("button", { name: "Start Onboarding" });

  await expect(wizardHeading.or(startButton)).toBeVisible({ timeout: 20_000 });

  if (await startButton.isVisible()) {
    await startButton.click();
  }

  await expect(wizardHeading).toBeVisible({ timeout: 10_000 });
}

test.describe("Docker authenticated onboarding smoke", () => {
  test("logs in, completes onboarding, and triggers the first CEO run", async ({
    page,
  }) => {
    await signIn(page);
    await openOnboarding(page);

    await page.locator('input[placeholder="Acme Corp"]').fill(COMPANY_NAME);
    await page.getByRole("button", { name: "Next" }).click();

    await expect(
      page.locator("h3", { hasText: "Create your first agent" })
    ).toBeVisible({ timeout: 10_000 });

    await expect(page.locator('input[placeholder="CEO"]')).toHaveValue(AGENT_NAME);
    await page.getByRole("button", { name: "Process" }).click();
    await page.locator('input[placeholder="e.g. node, python"]').fill("echo");
    await page
      .locator('input[placeholder="e.g. script.js, --flag"]')
      .fill("release smoke");
    await page.getByRole("button", { name: "Next" }).click();

    await expect(
      page.locator("h3", { hasText: "Give it something to do" })
    ).toBeVisible({ timeout: 10_000 });
    await page
      .locator('input[placeholder="e.g. Research competitor pricing"]')
      .fill(TASK_TITLE);
    await page.getByRole("button", { name: "Next" }).click();

    await expect(
      page.locator("h3", { hasText: "Ready to launch" })
    ).toBeVisible({ timeout: 10_000 });
    await expect(page.getByText(COMPANY_NAME)).toBeVisible();
    await expect(page.getByText(AGENT_NAME)).toBeVisible();
    await expect(page.getByText(TASK_TITLE)).toBeVisible();

    await page.getByRole("button", { name: "Create & Open Issue" }).click();
    await expect(page).toHaveURL(/\/issues\//, { timeout: 10_000 });

    const baseUrl = new URL(page.url()).origin;

    const companiesRes = await page.request.get(`${baseUrl}/api/companies`);
    expect(companiesRes.ok()).toBe(true);
    const companies = (await companiesRes.json()) as Array<{ id: string; name: string }>;
    const company = companies.find((entry) => entry.name === COMPANY_NAME);
    expect(company).toBeTruthy();

    const agentsRes = await page.request.get(
      `${baseUrl}/api/companies/${company!.id}/agents`
    );
    expect(agentsRes.ok()).toBe(true);
    const agents = (await agentsRes.json()) as Array<{
      id: string;
      name: string;
      role: string;
      adapterType: string;
    }>;
    const ceoAgent = agents.find((entry) => entry.name === AGENT_NAME);
    expect(ceoAgent).toBeTruthy();
    expect(ceoAgent!.role).toBe("ceo");
    expect(ceoAgent!.adapterType).toBe("process");

    const issuesRes = await page.request.get(
      `${baseUrl}/api/companies/${company!.id}/issues`
    );
    expect(issuesRes.ok()).toBe(true);
    const issues = (await issuesRes.json()) as Array<{
      id: string;
      title: string;
      assigneeAgentId: string | null;
    }>;
    const issue = issues.find((entry) => entry.title === TASK_TITLE);
    expect(issue).toBeTruthy();
    expect(issue!.assigneeAgentId).toBe(ceoAgent!.id);

    await expect.poll(
      async () => {
        const runsRes = await page.request.get(
          `${baseUrl}/api/companies/${company!.id}/heartbeat-runs?agentId=${ceoAgent!.id}`
        );
        expect(runsRes.ok()).toBe(true);
        const runs = (await runsRes.json()) as Array<{
          agentId: string;
          invocationSource: string;
          status: string;
        }>;
        const latestRun = runs.find((entry) => entry.agentId === ceoAgent!.id);
        return latestRun
          ? {
              invocationSource: latestRun.invocationSource,
              status: latestRun.status,
            }
          : null;
      },
      {
        timeout: 30_000,
        intervals: [1_000, 2_000, 5_000],
      }
    ).toEqual(
      expect.objectContaining({
        invocationSource: "assignment",
        status: expect.stringMatching(/^(queued|running|succeeded)$/),
      })
    );
  });
});
|
||||
28
tests/release-smoke/playwright.config.ts
Normal file
@@ -0,0 +1,28 @@
import { defineConfig } from "@playwright/test";

const BASE_URL =
  process.env.PAPERCLIP_RELEASE_SMOKE_BASE_URL ?? "http://127.0.0.1:3232";

export default defineConfig({
  testDir: ".",
  testMatch: "**/*.spec.ts",
  timeout: 90_000,
  expect: {
    timeout: 15_000,
  },
  retries: process.env.CI ? 1 : 0,
  use: {
    baseURL: BASE_URL,
    headless: true,
    screenshot: "only-on-failure",
    trace: "retain-on-failure",
  },
  projects: [
    {
      name: "chromium",
      use: { browserName: "chromium" },
    },
  ],
  outputDir: "./test-results",
  reporter: [["list"], ["html", { open: "never", outputFolder: "./playwright-report" }]],
});
@@ -1,4 +1,3 @@
import { useEffect, useRef } from "react";
import { Navigate, Outlet, Route, Routes, useLocation, useParams } from "@/lib/router";
import { useQuery } from "@tanstack/react-query";
import { Button } from "@/components/ui/button";
@@ -43,6 +42,7 @@ import { queryKeys } from "./lib/queryKeys";
import { useCompany } from "./context/CompanyContext";
import { useDialog } from "./context/DialogContext";
import { loadLastInboxTab } from "./lib/inbox";
import { shouldRedirectCompanylessRouteToOnboarding } from "./lib/onboarding-route";

function BootstrapPendingPage({ hasActiveInvite = false }: { hasActiveInvite?: boolean }) {
  return (
@@ -181,24 +181,13 @@ function LegacySettingsRedirect() {
}

function OnboardingRoutePage() {
  const { companies, loading } = useCompany();
  const { onboardingOpen, openOnboarding } = useDialog();
  const { companies } = useCompany();
  const { openOnboarding } = useDialog();
  const { companyPrefix } = useParams<{ companyPrefix?: string }>();
  const opened = useRef(false);
  const matchedCompany = companyPrefix
    ? companies.find((company) => company.issuePrefix.toUpperCase() === companyPrefix.toUpperCase()) ?? null
    : null;

  useEffect(() => {
    if (loading || opened.current || onboardingOpen) return;
    opened.current = true;
    if (matchedCompany) {
      openOnboarding({ initialStep: 2, companyId: matchedCompany.id });
      return;
    }
    openOnboarding();
  }, [companyPrefix, loading, matchedCompany, onboardingOpen, openOnboarding]);

  const title = matchedCompany
    ? `Add another agent to ${matchedCompany.name}`
    : companies.length > 0
@@ -233,19 +222,22 @@

function CompanyRootRedirect() {
  const { companies, selectedCompany, loading } = useCompany();
  const { onboardingOpen } = useDialog();
  const location = useLocation();

  if (loading) {
    return <div className="mx-auto max-w-xl py-10 text-sm text-muted-foreground">Loading...</div>;
  }

  // Keep the first-run onboarding mounted until it completes.
  if (onboardingOpen) {
    return <NoCompaniesStartPage autoOpen={false} />;
  }

  const targetCompany = selectedCompany ?? companies[0] ?? null;
  if (!targetCompany) {
    if (
      shouldRedirectCompanylessRouteToOnboarding({
        pathname: location.pathname,
        hasCompanies: false,
      })
    ) {
      return <Navigate to="/onboarding" replace />;
    }
    return <NoCompaniesStartPage />;
  }

@@ -262,6 +254,14 @@ function UnprefixedBoardRedirect() {

  const targetCompany = selectedCompany ?? companies[0] ?? null;
  if (!targetCompany) {
    if (
      shouldRedirectCompanylessRouteToOnboarding({
        pathname: location.pathname,
        hasCompanies: false,
      })
    ) {
      return <Navigate to="/onboarding" replace />;
    }
    return <NoCompaniesStartPage />;
  }

@@ -273,16 +273,8 @@ function UnprefixedBoardRedirect() {
  );
}

function NoCompaniesStartPage({ autoOpen = true }: { autoOpen?: boolean }) {
function NoCompaniesStartPage() {
  const { openOnboarding } = useDialog();
  const opened = useRef(false);

  useEffect(() => {
    if (!autoOpen) return;
    if (opened.current) return;
    opened.current = true;
    openOnboarding();
  }, [autoOpen, openOnboarding]);

  return (
    <div className="mx-auto max-w-xl py-10">

@@ -1,11 +1,6 @@
import { useState, useEffect, useRef, useMemo, useCallback } from "react";
import { useMutation, useQuery, useQueryClient } from "@tanstack/react-query";
import { AGENT_ADAPTER_TYPES } from "@paperclipai/shared";
import {
  hasSessionCompactionThresholds,
  resolveSessionCompactionPolicy,
  type ResolvedSessionCompactionPolicy,
} from "@paperclipai/adapter-utils";
import type {
  Agent,
  AdapterEnvironmentTestResult,
@@ -420,12 +415,6 @@ export function AgentConfigForm(props: AgentConfigFormProps) {
      heartbeat: mergedHeartbeat,
    };
  }, [isCreate, overlay.heartbeat, runtimeConfig, val]);
  const sessionCompaction = useMemo(
    () => resolveSessionCompactionPolicy(adapterType, effectiveRuntimeConfig),
    [adapterType, effectiveRuntimeConfig],
  );
  const showSessionCompactionCard = Boolean(sessionCompaction.adapterSessionManagement);

  return (
    <div className={cn("relative", cards && "space-y-6")}>
      {/* ---- Floating Save button (edit mode, when dirty) ---- */}
@@ -735,6 +724,32 @@ export function AgentConfigForm(props: AgentConfigFormProps) {
          )}
        </>
      )}
      {!isCreate && typeof config.bootstrapPromptTemplate === "string" && config.bootstrapPromptTemplate && (
        <>
          <Field label="Bootstrap prompt (legacy)" hint={help.bootstrapPrompt}>
            <MarkdownEditor
              value={eff(
                "adapterConfig",
                "bootstrapPromptTemplate",
                String(config.bootstrapPromptTemplate ?? ""),
              )}
              onChange={(v) =>
                mark("adapterConfig", "bootstrapPromptTemplate", v || undefined)
              }
              placeholder="Optional initial setup prompt for the first run"
              contentClassName="min-h-[44px] text-sm font-mono"
              imageUploadHandler={async (file) => {
                const namespace = `agents/${props.agent.id}/bootstrap-prompt`;
                const asset = await uploadMarkdownImage.mutateAsync({ file, namespace });
                return asset.contentPath;
              }}
            />
          </Field>
          <div className="rounded-md border border-amber-500/25 bg-amber-500/10 px-3 py-2 text-xs text-amber-200">
            Bootstrap prompt is legacy and will be removed in a future release. Consider moving this content into the agent's prompt template or instructions file instead.
          </div>
        </>
      )}
      {adapterType === "claude_local" && (
        <ClaudeLocalAdvancedFields {...adapterFieldProps} />
      )}
@@ -831,12 +846,6 @@ export function AgentConfigForm(props: AgentConfigFormProps) {
              numberHint={help.intervalSec}
              showNumber={val!.heartbeatEnabled}
            />
            {showSessionCompactionCard && (
              <SessionCompactionPolicyCard
                adapterType={adapterType}
                resolution={sessionCompaction}
              />
            )}
          </div>
        </div>
      ) : !isCreate ? (
@@ -859,12 +868,6 @@ export function AgentConfigForm(props: AgentConfigFormProps) {
              numberHint={help.intervalSec}
              showNumber={eff("heartbeat", "enabled", heartbeat.enabled !== false)}
            />
            {showSessionCompactionCard && (
              <SessionCompactionPolicyCard
                adapterType={adapterType}
                resolution={sessionCompaction}
              />
            )}
          </div>
          <CollapsibleSection
            title="Advanced Run Policy"
@@ -952,69 +955,6 @@ function AdapterEnvironmentResult({ result }: { result: AdapterEnvironmentTestRe
  );
}

function formatSessionThreshold(value: number, suffix: string) {
  if (value <= 0) return "Off";
  return `${value.toLocaleString("en-US")} ${suffix}`;
}

function SessionCompactionPolicyCard({
  adapterType,
  resolution,
}: {
  adapterType: string;
  resolution: ResolvedSessionCompactionPolicy;
}) {
  const { adapterSessionManagement, policy, source } = resolution;
  if (!adapterSessionManagement) return null;

  const adapterLabel = adapterLabels[adapterType] ?? adapterType;
  const sourceLabel = source === "agent_override" ? "Agent override" : "Adapter default";
  const rotationDisabled = !policy.enabled || !hasSessionCompactionThresholds(policy);
  const nativeSummary =
    adapterSessionManagement.nativeContextManagement === "confirmed"
      ? `${adapterLabel} is treated as natively managing long context, so Paperclip fresh-session rotation defaults to off.`
      : adapterSessionManagement.nativeContextManagement === "likely"
        ? `${adapterLabel} likely manages long context itself, but Paperclip still keeps conservative rotation defaults for now.`
        : `${adapterLabel} does not have verified native compaction behavior, so Paperclip keeps conservative rotation defaults.`;

  return (
    <div className="rounded-md border border-sky-500/25 bg-sky-500/10 px-3 py-3 space-y-2">
      <div className="flex items-center justify-between gap-3">
        <div className="text-xs font-medium text-sky-50">Session compaction</div>
        <span className="rounded-full border border-sky-400/30 px-2 py-0.5 text-[11px] text-sky-100">
          {sourceLabel}
        </span>
      </div>
      <p className="text-xs text-sky-100/90">
        {nativeSummary}
      </p>
      <p className="text-xs text-sky-100/80">
        {rotationDisabled
          ? "No Paperclip-managed fresh-session thresholds are active for this adapter."
          : "Paperclip will start a fresh session when one of these thresholds is reached."}
      </p>
      <div className="grid grid-cols-3 gap-2 text-[11px] text-sky-100/85 tabular-nums">
        <div>
          <div className="text-sky-100/60">Runs</div>
          <div>{formatSessionThreshold(policy.maxSessionRuns, "runs")}</div>
        </div>
        <div>
          <div className="text-sky-100/60">Raw input</div>
          <div>{formatSessionThreshold(policy.maxRawInputTokens, "tokens")}</div>
        </div>
        <div>
          <div className="text-sky-100/60">Age</div>
          <div>{formatSessionThreshold(policy.maxSessionAgeHours, "hours")}</div>
        </div>
      </div>
      <p className="text-[11px] text-sky-100/75">
        A large cumulative raw token total does not mean the full session is resent on every heartbeat.
        {source === "agent_override" && " This agent has an explicit runtimeConfig session compaction override."}
      </p>
    </div>
  );
}

/* ---- Internal sub-components ---- */

const ENABLED_ADAPTER_TYPES = new Set(["claude_local", "codex_local", "gemini_local", "opencode_local", "cursor"]);

@@ -32,6 +32,7 @@ import { queryKeys } from "../lib/queryKeys";
import { cn } from "../lib/utils";
import { NotFoundPage } from "../pages/NotFound";
import { Button } from "@/components/ui/button";
import { Tooltip, TooltipTrigger, TooltipContent } from "@/components/ui/tooltip";

const INSTANCE_SETTINGS_MEMORY_KEY = "paperclip.lastInstanceSettingsPath";

@@ -298,7 +299,12 @@ export function Layout() {
            <span className="truncate">Documentation</span>
          </a>
          {health?.version && (
            <span className="px-2 text-xs text-muted-foreground shrink-0">v{health.version}</span>
            <Tooltip>
              <TooltipTrigger asChild>
                <span className="px-2 text-xs text-muted-foreground shrink-0 cursor-default">v</span>
              </TooltipTrigger>
              <TooltipContent>v{health.version}</TooltipContent>
            </Tooltip>
          )}
          <Button variant="ghost" size="icon-sm" className="text-muted-foreground shrink-0" asChild>
            <Link
@@ -351,7 +357,12 @@ export function Layout() {
            <span className="truncate">Documentation</span>
          </a>
          {health?.version && (
            <span className="px-2 text-xs text-muted-foreground shrink-0">v{health.version}</span>
            <Tooltip>
              <TooltipTrigger asChild>
                <span className="px-2 text-xs text-muted-foreground shrink-0 cursor-default">v</span>
              </TooltipTrigger>
              <TooltipContent>v{health.version}</TooltipContent>
            </Tooltip>
          )}
          <Button variant="ghost" size="icon-sm" className="text-muted-foreground shrink-0" asChild>
            <Link

@@ -1,7 +1,7 @@
import { useEffect, useState, useRef, useCallback, useMemo } from "react";
import { useNavigate } from "react-router-dom";
import { useQuery, useQueryClient } from "@tanstack/react-query";
import type { AdapterEnvironmentTestResult } from "@paperclipai/shared";
import { useLocation, useNavigate, useParams } from "@/lib/router";
import { useDialog } from "../context/DialogContext";
import { useCompany } from "../context/CompanyContext";
import { companiesApi } from "../api/companies";
@@ -30,6 +30,7 @@ import {
} from "@paperclipai/adapter-codex-local";
import { DEFAULT_CURSOR_LOCAL_MODEL } from "@paperclipai/adapter-cursor-local";
import { DEFAULT_GEMINI_LOCAL_MODEL } from "@paperclipai/adapter-gemini-local";
import { resolveRouteOnboardingOptions } from "../lib/onboarding-route";
import { AsciiArtAnimation } from "./AsciiArtAnimation";
import { ChoosePathButton } from "./PathInstructionsModal";
import { HintIcon } from "./agent-config-primitives";
@@ -75,12 +76,29 @@ After that, hire yourself a Founding Engineer agent and then plan the roadmap an

export function OnboardingWizard() {
  const { onboardingOpen, onboardingOptions, closeOnboarding } = useDialog();
  const { selectedCompanyId, companies, setSelectedCompanyId } = useCompany();
  const { companies, setSelectedCompanyId, loading: companiesLoading } = useCompany();
  const queryClient = useQueryClient();
  const navigate = useNavigate();
  const location = useLocation();
  const { companyPrefix } = useParams<{ companyPrefix?: string }>();
  const [routeDismissed, setRouteDismissed] = useState(false);

  const initialStep = onboardingOptions.initialStep ?? 1;
  const existingCompanyId = onboardingOptions.companyId;
  const routeOnboardingOptions =
    companyPrefix && companiesLoading
      ? null
      : resolveRouteOnboardingOptions({
          pathname: location.pathname,
          companyPrefix,
          companies,
        });
  const effectiveOnboardingOpen =
    onboardingOpen || (routeOnboardingOptions !== null && !routeDismissed);
  const effectiveOnboardingOptions = onboardingOpen
    ? onboardingOptions
    : routeOnboardingOptions ?? {};

  const initialStep = effectiveOnboardingOptions.initialStep ?? 1;
  const existingCompanyId = effectiveOnboardingOptions.companyId;

  const [step, setStep] = useState<Step>(initialStep);
  const [loading, setLoading] = useState(false);
@@ -134,27 +152,31 @@
  const [createdAgentId, setCreatedAgentId] = useState<string | null>(null);
  const [createdIssueRef, setCreatedIssueRef] = useState<string | null>(null);

  useEffect(() => {
    setRouteDismissed(false);
  }, [location.pathname]);

  // Sync step and company when onboarding opens with options.
  // Keep this independent from company-list refreshes so Step 1 completion
  // doesn't get reset after creating a company.
  useEffect(() => {
    if (!onboardingOpen) return;
    const cId = onboardingOptions.companyId ?? null;
    setStep(onboardingOptions.initialStep ?? 1);
    if (!effectiveOnboardingOpen) return;
    const cId = effectiveOnboardingOptions.companyId ?? null;
    setStep(effectiveOnboardingOptions.initialStep ?? 1);
    setCreatedCompanyId(cId);
    setCreatedCompanyPrefix(null);
  }, [
    onboardingOpen,
    onboardingOptions.companyId,
    onboardingOptions.initialStep
    effectiveOnboardingOpen,
    effectiveOnboardingOptions.companyId,
    effectiveOnboardingOptions.initialStep
  ]);

  // Backfill issue prefix for an existing company once companies are loaded.
  useEffect(() => {
    if (!onboardingOpen || !createdCompanyId || createdCompanyPrefix) return;
    if (!effectiveOnboardingOpen || !createdCompanyId || createdCompanyPrefix) return;
    const company = companies.find((c) => c.id === createdCompanyId);
    if (company) setCreatedCompanyPrefix(company.issuePrefix);
  }, [onboardingOpen, createdCompanyId, createdCompanyPrefix, companies]);
  }, [effectiveOnboardingOpen, createdCompanyId, createdCompanyPrefix, companies]);

  // Resize textarea when step 3 is shown or description changes
  useEffect(() => {
@@ -171,7 +193,7 @@
      ? queryKeys.agents.adapterModels(createdCompanyId, adapterType)
      : ["agents", "none", "adapter-models", adapterType],
    queryFn: () => agentsApi.adapterModels(createdCompanyId!, adapterType),
    enabled: Boolean(createdCompanyId) && onboardingOpen && step === 2
    enabled: Boolean(createdCompanyId) && effectiveOnboardingOpen && step === 2
  });
  const isLocalAdapter =
    adapterType === "claude_local" ||
@@ -546,13 +568,16 @@
    }
  }

  if (!onboardingOpen) return null;
  if (!effectiveOnboardingOpen) return null;

  return (
    <Dialog
      open={onboardingOpen}
      open={effectiveOnboardingOpen}
      onOpenChange={(open) => {
        if (!open) handleClose();
        if (!open) {
          setRouteDismissed(true);
          handleClose();
        }
      }}
    >
      <DialogPortal>
@@ -762,6 +787,12 @@
      icon: Gem,
      desc: "Local Gemini agent"
    },
    {
      value: "process" as const,
      label: "Process",
      icon: Terminal,
      desc: "Run a local command"
    },
    {
      value: "opencode_local" as const,
      label: "OpenCode",

@@ -4,11 +4,14 @@ import { beforeEach, describe, expect, it } from "vitest";
import type { Approval, DashboardSummary, HeartbeatRun, Issue, JoinRequest } from "@paperclipai/shared";
import {
  computeInboxBadgeData,
  getApprovalsForTab,
  getInboxWorkItems,
  getRecentTouchedIssues,
  getUnreadTouchedIssues,
  loadLastInboxTab,
  RECENT_ISSUES_LIMIT,
  saveLastInboxTab,
  shouldShowInboxSection,
} from "./inbox";

const storage = new Map<string, string>();
@@ -46,6 +49,19 @@ function makeApproval(status: Approval["status"]): Approval {
  };
}

function makeApprovalWithTimestamps(
  id: string,
  status: Approval["status"],
  updatedAt: string,
): Approval {
  return {
    ...makeApproval(status),
    id,
    createdAt: new Date(updatedAt),
    updatedAt: new Date(updatedAt),
  };
}

function makeJoinRequest(id: string): JoinRequest {
  return {
    id,
@@ -231,6 +247,77 @@ describe("inbox helpers", () => {
    expect(issues).toHaveLength(2);
  });

  it("shows recent approvals in updated order and unread approvals as actionable only", () => {
    const approvals = [
      makeApprovalWithTimestamps("approval-approved", "approved", "2026-03-11T02:00:00.000Z"),
      makeApprovalWithTimestamps("approval-pending", "pending", "2026-03-11T01:00:00.000Z"),
      makeApprovalWithTimestamps(
        "approval-revision",
        "revision_requested",
        "2026-03-11T03:00:00.000Z",
      ),
    ];

    expect(getApprovalsForTab(approvals, "recent", "all").map((approval) => approval.id)).toEqual([
      "approval-revision",
      "approval-approved",
      "approval-pending",
    ]);
    expect(getApprovalsForTab(approvals, "unread", "all").map((approval) => approval.id)).toEqual([
      "approval-revision",
      "approval-pending",
    ]);
    expect(getApprovalsForTab(approvals, "all", "resolved").map((approval) => approval.id)).toEqual([
      "approval-approved",
    ]);
  });

  it("mixes approvals into the inbox feed by most recent activity", () => {
    const newerIssue = makeIssue("1", true);
    newerIssue.lastExternalCommentAt = new Date("2026-03-11T04:00:00.000Z");

    const olderIssue = makeIssue("2", false);
    olderIssue.lastExternalCommentAt = new Date("2026-03-11T02:00:00.000Z");

    const approval = makeApprovalWithTimestamps(
      "approval-between",
      "pending",
      "2026-03-11T03:00:00.000Z",
    );

    expect(
      getInboxWorkItems({
        issues: [olderIssue, newerIssue],
        approvals: [approval],
      }).map((item) => item.kind === "issue" ? `issue:${item.issue.id}` : `approval:${item.approval.id}`),
    ).toEqual([
      "issue:1",
      "approval:approval-between",
      "issue:2",
    ]);
  });

  it("can include sections on recent without forcing them to be unread", () => {
    expect(
      shouldShowInboxSection({
        tab: "recent",
        hasItems: true,
        showOnRecent: true,
        showOnUnread: false,
        showOnAll: false,
      }),
    ).toBe(true);
    expect(
      shouldShowInboxSection({
        tab: "unread",
        hasItems: true,
        showOnRecent: true,
        showOnUnread: false,
        showOnAll: false,
      }),
    ).toBe(false);
  });

  it("limits recent touched issues before unread badge counting", () => {
    const issues = Array.from({ length: RECENT_ISSUES_LIMIT + 5 }, (_, index) => {
      const issue = makeIssue(String(index + 1), index < 3);

@@ -12,6 +12,18 @@ export const ACTIONABLE_APPROVAL_STATUSES = new Set(["pending", "revision_reques
export const DISMISSED_KEY = "paperclip:inbox:dismissed";
export const INBOX_LAST_TAB_KEY = "paperclip:inbox:last-tab";
export type InboxTab = "recent" | "unread" | "all";
export type InboxApprovalFilter = "all" | "actionable" | "resolved";
export type InboxWorkItem =
  | {
      kind: "issue";
      timestamp: number;
      issue: Issue;
    }
  | {
      kind: "approval";
      timestamp: number;
      approval: Approval;
    };

export interface InboxBadgeData {
  inbox: number;
@@ -104,6 +116,85 @@ export function getUnreadTouchedIssues(issues: Issue[]): Issue[] {
  return issues.filter((issue) => issue.isUnreadForMe);
}

export function getApprovalsForTab(
  approvals: Approval[],
  tab: InboxTab,
  filter: InboxApprovalFilter,
): Approval[] {
  const sortedApprovals = [...approvals].sort(
    (a, b) => normalizeTimestamp(b.updatedAt) - normalizeTimestamp(a.updatedAt),
  );

  if (tab === "recent") return sortedApprovals;
  if (tab === "unread") {
    return sortedApprovals.filter((approval) => ACTIONABLE_APPROVAL_STATUSES.has(approval.status));
  }
  if (filter === "all") return sortedApprovals;

  return sortedApprovals.filter((approval) => {
    const isActionable = ACTIONABLE_APPROVAL_STATUSES.has(approval.status);
    return filter === "actionable" ? isActionable : !isActionable;
  });
}

export function approvalActivityTimestamp(approval: Approval): number {
  const updatedAt = normalizeTimestamp(approval.updatedAt);
  if (updatedAt > 0) return updatedAt;
  return normalizeTimestamp(approval.createdAt);
}

export function getInboxWorkItems({
  issues,
  approvals,
}: {
  issues: Issue[];
  approvals: Approval[];
}): InboxWorkItem[] {
  return [
    ...issues.map((issue) => ({
      kind: "issue" as const,
      timestamp: issueLastActivityTimestamp(issue),
      issue,
    })),
    ...approvals.map((approval) => ({
      kind: "approval" as const,
      timestamp: approvalActivityTimestamp(approval),
      approval,
    })),
  ].sort((a, b) => {
    const timestampDiff = b.timestamp - a.timestamp;
    if (timestampDiff !== 0) return timestampDiff;

    if (a.kind === "issue" && b.kind === "issue") {
      return sortIssuesByMostRecentActivity(a.issue, b.issue);
    }
    if (a.kind === "approval" && b.kind === "approval") {
      return approvalActivityTimestamp(b.approval) - approvalActivityTimestamp(a.approval);
    }

    return a.kind === "approval" ? -1 : 1;
  });
}

export function shouldShowInboxSection({
  tab,
  hasItems,
  showOnRecent,
  showOnUnread,
  showOnAll,
}: {
  tab: InboxTab;
  hasItems: boolean;
  showOnRecent: boolean;
  showOnUnread: boolean;
  showOnAll: boolean;
}): boolean {
  if (!hasItems) return false;
  if (tab === "recent") return showOnRecent;
  if (tab === "unread") return showOnUnread;
  return showOnAll;
}

export function computeInboxBadgeData({
  approvals,
  joinRequests,

80
ui/src/lib/onboarding-route.test.ts
Normal file
@@ -0,0 +1,80 @@
import { describe, expect, it } from "vitest";
import {
  isOnboardingPath,
  resolveRouteOnboardingOptions,
  shouldRedirectCompanylessRouteToOnboarding,
} from "./onboarding-route";

describe("isOnboardingPath", () => {
  it("matches the global onboarding route", () => {
    expect(isOnboardingPath("/onboarding")).toBe(true);
  });

  it("matches a company-prefixed onboarding route", () => {
    expect(isOnboardingPath("/pap/onboarding")).toBe(true);
  });

  it("ignores non-onboarding routes", () => {
    expect(isOnboardingPath("/pap/dashboard")).toBe(false);
  });
});

describe("resolveRouteOnboardingOptions", () => {
  it("opens company creation for the global onboarding route", () => {
    expect(
      resolveRouteOnboardingOptions({
        pathname: "/onboarding",
        companies: [],
      }),
    ).toEqual({ initialStep: 1 });
  });

  it("opens agent creation when the prefixed company exists", () => {
    expect(
      resolveRouteOnboardingOptions({
        pathname: "/pap/onboarding",
        companyPrefix: "pap",
        companies: [{ id: "company-1", issuePrefix: "PAP" }],
      }),
    ).toEqual({ initialStep: 2, companyId: "company-1" });
  });

  it("falls back to company creation when the prefixed company is missing", () => {
    expect(
      resolveRouteOnboardingOptions({
        pathname: "/pap/onboarding",
        companyPrefix: "pap",
        companies: [],
      }),
    ).toEqual({ initialStep: 1 });
  });
});

describe("shouldRedirectCompanylessRouteToOnboarding", () => {
  it("redirects companyless entry routes into onboarding", () => {
    expect(
      shouldRedirectCompanylessRouteToOnboarding({
        pathname: "/",
        hasCompanies: false,
      }),
    ).toBe(true);
  });

  it("does not redirect when already on onboarding", () => {
    expect(
      shouldRedirectCompanylessRouteToOnboarding({
        pathname: "/onboarding",
        hasCompanies: false,
      }),
    ).toBe(false);
  });

  it("does not redirect when companies exist", () => {
    expect(
      shouldRedirectCompanylessRouteToOnboarding({
        pathname: "/issues",
        hasCompanies: true,
      }),
    ).toBe(false);
  });
});
51
ui/src/lib/onboarding-route.ts
Normal file
@@ -0,0 +1,51 @@
type OnboardingRouteCompany = {
  id: string;
  issuePrefix: string;
};

export function isOnboardingPath(pathname: string): boolean {
  const segments = pathname.split("/").filter(Boolean);

  if (segments.length === 1) {
    return segments[0]?.toLowerCase() === "onboarding";
  }

  if (segments.length === 2) {
    return segments[1]?.toLowerCase() === "onboarding";
  }

  return false;
}

export function resolveRouteOnboardingOptions(params: {
  pathname: string;
  companyPrefix?: string;
  companies: OnboardingRouteCompany[];
}): { initialStep: 1 | 2; companyId?: string } | null {
  const { pathname, companyPrefix, companies } = params;

  if (!isOnboardingPath(pathname)) return null;

  if (!companyPrefix) {
    return { initialStep: 1 };
  }

  const matchedCompany =
    companies.find(
      (company) =>
        company.issuePrefix.toUpperCase() === companyPrefix.toUpperCase(),
    ) ?? null;

  if (!matchedCompany) {
    return { initialStep: 1 };
  }

  return { initialStep: 2, companyId: matchedCompany.id };
}

export function shouldRedirectCompanylessRouteToOnboarding(params: {
  pathname: string;
  hasCompanies: boolean;
}): boolean {
  return !params.hasCompanies && !isOnboardingPath(params.pathname);
}
@@ -731,8 +731,8 @@ export function AgentDetail() {
|
||||
crumbs.push({ label: "Instructions" });
|
||||
} else if (activeView === "configuration") {
|
||||
crumbs.push({ label: "Configuration" });
|
||||
} else if (activeView === "skills") {
|
||||
crumbs.push({ label: "Skills" });
|
||||
// } else if (activeView === "skills") { // TODO: bring back later
|
||||
// crumbs.push({ label: "Skills" });
|
||||
} else if (activeView === "runs") {
|
||||
crumbs.push({ label: "Runs" });
|
||||
} else if (activeView === "budget") {
|
||||
@@ -892,8 +892,9 @@ export function AgentDetail() {
|
||||
items={[
|
||||
{ value: "dashboard", label: "Dashboard" },
|
||||
{ value: "instructions", label: "Instructions" },
|
||||
{ value: "skills", label: "Skills" },
|
||||
{ value: "configuration", label: "Configuration" },
|
||||
{ value: "skills", label: "Skills" },
|
||||
{ value: "runs", label: "Runs" },
|
||||
{ value: "budget", label: "Budget" },
|
||||
]}
|
||||
value={activeView}
|
||||
|
||||
@@ -14,11 +14,11 @@ import { queryKeys } from "../lib/queryKeys";
import { createIssueDetailLocationState } from "../lib/issueDetailBreadcrumb";
import { EmptyState } from "../components/EmptyState";
import { PageSkeleton } from "../components/PageSkeleton";
import { ApprovalCard } from "../components/ApprovalCard";
import { IssueRow } from "../components/IssueRow";
import { PriorityIcon } from "../components/PriorityIcon";
import { StatusIcon } from "../components/StatusIcon";
import { StatusBadge } from "../components/StatusBadge";
import { defaultTypeIcon, typeIcon, typeLabel } from "../components/ApprovalPayload";
import { timeAgo } from "../lib/timeAgo";
import { Button } from "@/components/ui/button";
import { Separator } from "@/components/ui/separator";
@@ -40,13 +40,17 @@ import {
} from "lucide-react";
import { Identity } from "../components/Identity";
import { PageTabBar } from "../components/PageTabBar";
import type { HeartbeatRun, Issue, JoinRequest } from "@paperclipai/shared";
import type { Approval, HeartbeatRun, Issue, JoinRequest } from "@paperclipai/shared";
import {
  ACTIONABLE_APPROVAL_STATUSES,
  getApprovalsForTab,
  getInboxWorkItems,
  getLatestFailedRunsByAgent,
  getRecentTouchedIssues,
  type InboxTab,
  InboxApprovalFilter,
  saveLastInboxTab,
  shouldShowInboxSection,
  type InboxTab,
} from "../lib/inbox";
import { useDismissedInboxItems } from "../hooks/useInboxBadge";

@@ -57,11 +61,9 @@ type InboxCategoryFilter =
  | "approvals"
  | "failed_runs"
  | "alerts";
type InboxApprovalFilter = "all" | "actionable" | "resolved";
type SectionKey =
  | "issues_i_touched"
  | "work_items"
  | "join_requests"
  | "approvals"
  | "failed_runs"
  | "alerts";

@@ -82,6 +84,10 @@ function runFailureMessage(run: HeartbeatRun): string {
  return firstNonEmptyLine(run.error) ?? firstNonEmptyLine(run.stderrExcerpt) ?? "Run exited with an error.";
}

function approvalStatusLabel(status: Approval["status"]): string {
  return status.replaceAll("_", " ");
}

function readIssueIdFromRun(run: HeartbeatRun): string | null {
  const context = run.contextSnapshot;
  if (!context) return null;
@@ -233,6 +239,95 @@ function FailedRunCard({
  );
}

function ApprovalInboxRow({
  approval,
  requesterName,
  onApprove,
  onReject,
  isPending,
}: {
  approval: Approval;
  requesterName: string | null;
  onApprove: () => void;
  onReject: () => void;
  isPending: boolean;
}) {
  const Icon = typeIcon[approval.type] ?? defaultTypeIcon;
  const label = typeLabel[approval.type] ?? approval.type;
  const showResolutionButtons =
    approval.type !== "budget_override_required" &&
    ACTIONABLE_APPROVAL_STATUSES.has(approval.status);

  return (
    <div className="border-b border-border px-2 py-2.5 last:border-b-0 sm:px-1 sm:pr-3 sm:py-2">
      <div className="flex items-start gap-2 sm:items-center">
        <Link
          to={`/approvals/${approval.id}`}
          className="flex min-w-0 flex-1 items-start gap-2 no-underline text-inherit transition-colors hover:bg-accent/50"
        >
          <span className="hidden h-2 w-2 shrink-0 sm:inline-flex" aria-hidden="true" />
          <span className="hidden h-3.5 w-3.5 shrink-0 sm:inline-flex" aria-hidden="true" />
          <span className="mt-0.5 shrink-0 rounded-md bg-muted p-1.5 sm:mt-0">
            <Icon className="h-4 w-4 text-muted-foreground" />
          </span>
          <span className="min-w-0 flex-1">
            <span className="line-clamp-2 text-sm font-medium sm:truncate sm:line-clamp-none">
              {label}
            </span>
            <span className="mt-1 flex flex-wrap items-center gap-x-2 gap-y-1 text-xs text-muted-foreground">
              <span className="capitalize">{approvalStatusLabel(approval.status)}</span>
              {requesterName ? <span>requested by {requesterName}</span> : null}
              <span>updated {timeAgo(approval.updatedAt)}</span>
            </span>
          </span>
        </Link>
        {showResolutionButtons ? (
          <div className="hidden shrink-0 items-center gap-2 sm:flex">
            <Button
              size="sm"
              className="h-8 bg-green-700 px-3 text-white hover:bg-green-600"
              onClick={onApprove}
              disabled={isPending}
            >
              Approve
            </Button>
            <Button
              variant="destructive"
              size="sm"
              className="h-8 px-3"
              onClick={onReject}
              disabled={isPending}
            >
              Reject
            </Button>
          </div>
        ) : null}
      </div>
      {showResolutionButtons ? (
        <div className="mt-3 flex gap-2 sm:hidden">
          <Button
            size="sm"
            className="h-8 bg-green-700 px-3 text-white hover:bg-green-600"
            onClick={onApprove}
            disabled={isPending}
          >
            Approve
          </Button>
          <Button
            variant="destructive"
            size="sm"
            className="h-8 px-3"
            onClick={onReject}
            disabled={isPending}
          >
            Reject
          </Button>
        </div>
      ) : null}
    </div>
  );
}

export function Inbox() {
  const { selectedCompanyId } = useCompany();
  const { setBreadcrumbs } = useBreadcrumbs();
@@ -334,6 +429,10 @@ export function Inbox() {
    () => touchedIssues.filter((issue) => issue.isUnreadForMe),
    [touchedIssues],
  );
  const issuesToRender = useMemo(
    () => (tab === "unread" ? unreadTouchedIssues : touchedIssues),
    [tab, touchedIssues, unreadTouchedIssues],
  );

  const agentById = useMemo(() => {
    const map = new Map<string, string>();
@@ -361,28 +460,28 @@ export function Inbox() {
    return ids;
  }, [heartbeatRuns]);

  const allApprovals = useMemo(
  const approvalsToRender = useMemo(
    () => getApprovalsForTab(approvals ?? [], tab, allApprovalFilter),
    [approvals, tab, allApprovalFilter],
  );
  const showJoinRequestsCategory =
    allCategoryFilter === "everything" || allCategoryFilter === "join_requests";
  const showTouchedCategory =
    allCategoryFilter === "everything" || allCategoryFilter === "issues_i_touched";
  const showApprovalsCategory =
    allCategoryFilter === "everything" || allCategoryFilter === "approvals";
  const showFailedRunsCategory =
    allCategoryFilter === "everything" || allCategoryFilter === "failed_runs";
  const showAlertsCategory = allCategoryFilter === "everything" || allCategoryFilter === "alerts";
  const workItemsToRender = useMemo(
    () =>
      [...(approvals ?? [])].sort(
        (a, b) => new Date(b.createdAt).getTime() - new Date(a.createdAt).getTime(),
      ),
    [approvals],
      getInboxWorkItems({
        issues: tab === "all" && !showTouchedCategory ? [] : issuesToRender,
        approvals: tab === "all" && !showApprovalsCategory ? [] : approvalsToRender,
      }),
    [approvalsToRender, issuesToRender, showApprovalsCategory, showTouchedCategory, tab],
  );

  const actionableApprovals = useMemo(
    () => allApprovals.filter((approval) => ACTIONABLE_APPROVAL_STATUSES.has(approval.status)),
    [allApprovals],
  );

  const filteredAllApprovals = useMemo(() => {
    if (allApprovalFilter === "all") return allApprovals;

    return allApprovals.filter((approval) => {
      const isActionable = ACTIONABLE_APPROVAL_STATUSES.has(approval.status);
      return allApprovalFilter === "actionable" ? isActionable : !isActionable;
    });
  }, [allApprovals, allApprovalFilter]);

  const agentName = (id: string | null) => {
    if (!id) return null;
    return agentById.get(id) ?? null;
@@ -505,39 +604,29 @@ export function Inbox() {
      !dismissed.has("alert:budget");
  const hasAlerts = showAggregateAgentError || showBudgetAlert;
  const hasJoinRequests = joinRequests.length > 0;
  const hasTouchedIssues = touchedIssues.length > 0;

  const showJoinRequestsCategory =
    allCategoryFilter === "everything" || allCategoryFilter === "join_requests";
  const showTouchedCategory =
    allCategoryFilter === "everything" || allCategoryFilter === "issues_i_touched";
  const showApprovalsCategory = allCategoryFilter === "everything" || allCategoryFilter === "approvals";
  const showFailedRunsCategory =
    allCategoryFilter === "everything" || allCategoryFilter === "failed_runs";
  const showAlertsCategory = allCategoryFilter === "everything" || allCategoryFilter === "alerts";

  const approvalsToRender = tab === "all" ? filteredAllApprovals : actionableApprovals;
  const showTouchedSection =
    tab === "all"
      ? showTouchedCategory && hasTouchedIssues
      : tab === "unread"
        ? unreadTouchedIssues.length > 0
        : hasTouchedIssues;
  const showWorkItemsSection = workItemsToRender.length > 0;
  const showJoinRequestsSection =
    tab === "all" ? showJoinRequestsCategory && hasJoinRequests : tab === "unread" && hasJoinRequests;
  const showApprovalsSection = tab === "all"
    ? showApprovalsCategory && filteredAllApprovals.length > 0
    : actionableApprovals.length > 0;
  const showFailedRunsSection =
    tab === "all" ? showFailedRunsCategory && hasRunFailures : tab === "unread" && hasRunFailures;
  const showAlertsSection = tab === "all" ? showAlertsCategory && hasAlerts : tab === "unread" && hasAlerts;
  const showFailedRunsSection = shouldShowInboxSection({
    tab,
    hasItems: hasRunFailures,
    showOnRecent: hasRunFailures,
    showOnUnread: hasRunFailures,
    showOnAll: showFailedRunsCategory && hasRunFailures,
  });
  const showAlertsSection = shouldShowInboxSection({
    tab,
    hasItems: hasAlerts,
    showOnRecent: hasAlerts,
    showOnUnread: hasAlerts,
    showOnAll: showAlertsCategory && hasAlerts,
  });

  const visibleSections = [
    showFailedRunsSection ? "failed_runs" : null,
    showAlertsSection ? "alerts" : null,
    showApprovalsSection ? "approvals" : null,
    showJoinRequestsSection ? "join_requests" : null,
    showTouchedSection ? "issues_i_touched" : null,
    showWorkItemsSection ? "work_items" : null,
  ].filter((key): key is SectionKey => key !== null);

  const allLoaded =
@@ -643,29 +732,72 @@ export function Inbox() {
        />
      )}

      {showApprovalsSection && (
      {showWorkItemsSection && (
        <>
          {showSeparatorBefore("approvals") && <Separator />}
          {showSeparatorBefore("work_items") && <Separator />}
          <div>
            <h3 className="mb-3 text-sm font-semibold uppercase tracking-wide text-muted-foreground">
              {tab === "unread" ? "Approvals Needing Action" : "Approvals"}
            </h3>
            <div className="grid gap-3">
              {approvalsToRender.map((approval) => (
                <ApprovalCard
                  key={approval.id}
                  approval={approval}
                  requesterAgent={
                    approval.requestedByAgentId
                      ? (agents ?? []).find((a) => a.id === approval.requestedByAgentId) ?? null
                      : null
                  }
                  onApprove={() => approveMutation.mutate(approval.id)}
                  onReject={() => rejectMutation.mutate(approval.id)}
                  detailLink={`/approvals/${approval.id}`}
                  isPending={approveMutation.isPending || rejectMutation.isPending}
                />
              ))}
            <div className="overflow-hidden rounded-xl border border-border bg-card">
              {workItemsToRender.map((item) => {
                if (item.kind === "approval") {
                  return (
                    <ApprovalInboxRow
                      key={`approval:${item.approval.id}`}
                      approval={item.approval}
                      requesterName={agentName(item.approval.requestedByAgentId)}
                      onApprove={() => approveMutation.mutate(item.approval.id)}
                      onReject={() => rejectMutation.mutate(item.approval.id)}
                      isPending={approveMutation.isPending || rejectMutation.isPending}
                    />
                  );
                }

                const issue = item.issue;
                const isUnread = issue.isUnreadForMe && !fadingOutIssues.has(issue.id);
                const isFading = fadingOutIssues.has(issue.id);
                return (
                  <IssueRow
                    key={`issue:${issue.id}`}
                    issue={issue}
                    issueLinkState={issueLinkState}
                    desktopMetaLeading={(
                      <>
                        <span className="hidden sm:inline-flex">
                          <PriorityIcon priority={issue.priority} />
                        </span>
                        <span className="hidden shrink-0 sm:inline-flex">
                          <StatusIcon status={issue.status} />
                        </span>
                        <span className="shrink-0 font-mono text-xs text-muted-foreground">
                          {issue.identifier ?? issue.id.slice(0, 8)}
                        </span>
                        {liveIssueIds.has(issue.id) && (
                          <span className="inline-flex items-center gap-1 rounded-full bg-blue-500/10 px-1.5 py-0.5 sm:gap-1.5 sm:px-2">
                            <span className="relative flex h-2 w-2">
                              <span className="absolute inline-flex h-full w-full animate-pulse rounded-full bg-blue-400 opacity-75" />
                              <span className="relative inline-flex h-2 w-2 rounded-full bg-blue-500" />
                            </span>
                            <span className="hidden text-[11px] font-medium text-blue-600 dark:text-blue-400 sm:inline">
                              Live
                            </span>
                          </span>
                        )}
                      </>
                    )}
                    mobileMeta={
                      issue.lastExternalCommentAt
                        ? `commented ${timeAgo(issue.lastExternalCommentAt)}`
                        : `updated ${timeAgo(issue.updatedAt)}`
                    }
                    unreadState={isUnread ? "visible" : isFading ? "fading" : "hidden"}
                    onMarkRead={() => markReadMutation.mutate(issue.id)}
                    trailingMeta={
                      issue.lastExternalCommentAt
                        ? `commented ${timeAgo(issue.lastExternalCommentAt)}`
                        : `updated ${timeAgo(issue.updatedAt)}`
                    }
                  />
                );
              })}
            </div>
          </div>
        </>
@@ -806,62 +938,6 @@ export function Inbox() {
        </>
      )}

      {showTouchedSection && (
        <>
          {showSeparatorBefore("issues_i_touched") && <Separator />}
          <div>
            <div>
              {(tab === "unread" ? unreadTouchedIssues : touchedIssues).map((issue) => {
                const isUnread = issue.isUnreadForMe && !fadingOutIssues.has(issue.id);
                const isFading = fadingOutIssues.has(issue.id);
                return (
                  <IssueRow
                    key={issue.id}
                    issue={issue}
                    issueLinkState={issueLinkState}
                    desktopMetaLeading={(
                      <>
                        <span className="hidden sm:inline-flex">
                          <PriorityIcon priority={issue.priority} />
                        </span>
                        <span className="hidden shrink-0 sm:inline-flex">
                          <StatusIcon status={issue.status} />
                        </span>
                        <span className="shrink-0 font-mono text-xs text-muted-foreground">
                          {issue.identifier ?? issue.id.slice(0, 8)}
                        </span>
                        {liveIssueIds.has(issue.id) && (
                          <span className="inline-flex items-center gap-1 rounded-full bg-blue-500/10 px-1.5 py-0.5 sm:gap-1.5 sm:px-2">
                            <span className="relative flex h-2 w-2">
                              <span className="absolute inline-flex h-full w-full animate-pulse rounded-full bg-blue-400 opacity-75" />
                              <span className="relative inline-flex h-2 w-2 rounded-full bg-blue-500" />
                            </span>
                            <span className="hidden text-[11px] font-medium text-blue-600 dark:text-blue-400 sm:inline">
                              Live
                            </span>
                          </span>
                        )}
                      </>
                    )}
                    mobileMeta={
                      issue.lastExternalCommentAt
                        ? `commented ${timeAgo(issue.lastExternalCommentAt)}`
                        : `updated ${timeAgo(issue.updatedAt)}`
                    }
                    unreadState={isUnread ? "visible" : isFading ? "fading" : "hidden"}
                    onMarkRead={() => markReadMutation.mutate(issue.id)}
                    trailingMeta={
                      issue.lastExternalCommentAt
                        ? `commented ${timeAgo(issue.lastExternalCommentAt)}`
                        : `updated ${timeAgo(issue.updatedAt)}`
                    }
                  />
                );
              })}
            </div>
          </div>
        </>
      )}
    </div>
  );
}

@@ -9,6 +9,7 @@ import { authApi } from "../api/auth";
import { projectsApi } from "../api/projects";
import { useCompany } from "../context/CompanyContext";
import { usePanel } from "../context/PanelContext";
import { useToast } from "../context/ToastContext";
import { useBreadcrumbs } from "../context/BreadcrumbContext";
import { queryKeys } from "../lib/queryKeys";
import { readIssueDetailBreadcrumb } from "../lib/issueDetailBreadcrumb";
@@ -36,8 +37,10 @@ import { ScrollArea } from "@/components/ui/scroll-area";
import { Tabs, TabsContent, TabsList, TabsTrigger } from "@/components/ui/tabs";
import {
  Activity as ActivityIcon,
  Check,
  ChevronDown,
  ChevronRight,
  Copy,
  EyeOff,
  Hexagon,
  ListTree,
@@ -196,7 +199,9 @@ export function IssueDetail() {
  const queryClient = useQueryClient();
  const navigate = useNavigate();
  const location = useLocation();
  const { pushToast } = useToast();
  const [moreOpen, setMoreOpen] = useState(false);
  const [copied, setCopied] = useState(false);
  const [mobilePropsOpen, setMobilePropsOpen] = useState(false);
  const [detailTab, setDetailTab] = useState("comments");
  const [secondaryOpen, setSecondaryOpen] = useState({
@@ -585,6 +590,22 @@ export function IssueDetail() {
    return () => closePanel();
  }, [issue]); // eslint-disable-line react-hooks/exhaustive-deps

  const copyIssueToClipboard = async () => {
    if (!issue) return;
    const decodeEntities = (text: string) => {
      const el = document.createElement("textarea");
      el.innerHTML = text;
      return el.value;
    };
    const title = decodeEntities(issue.title);
    const body = decodeEntities(issue.description ?? "");
    const md = `# ${issue.identifier}: ${title}\n\n${body}`.trimEnd();
    await navigator.clipboard.writeText(md);
    setCopied(true);
    pushToast({ title: "Copied to clipboard", tone: "success" });
    setTimeout(() => setCopied(false), 2000);
  };

  if (isLoading) return <p className="text-sm text-muted-foreground">Loading...</p>;
  if (error) return <p className="text-sm text-destructive">{error.message}</p>;
  if (!issue) return null;
@@ -737,17 +758,34 @@ export function IssueDetail() {
        </div>
      )}

      <Button
        variant="ghost"
        size="icon-xs"
        className="ml-auto md:hidden shrink-0"
        onClick={() => setMobilePropsOpen(true)}
        title="Properties"
      >
        <SlidersHorizontal className="h-4 w-4" />
      </Button>
      <div className="ml-auto flex items-center gap-0.5 md:hidden shrink-0">
        <Button
          variant="ghost"
          size="icon-xs"
          onClick={copyIssueToClipboard}
          title="Copy issue as markdown"
        >
          {copied ? <Check className="h-4 w-4 text-green-500" /> : <Copy className="h-4 w-4" />}
        </Button>
        <Button
          variant="ghost"
          size="icon-xs"
          onClick={() => setMobilePropsOpen(true)}
          title="Properties"
        >
          <SlidersHorizontal className="h-4 w-4" />
        </Button>
      </div>

      <div className="hidden md:flex items-center md:ml-auto shrink-0">
        <Button
          variant="ghost"
          size="icon-xs"
          onClick={copyIssueToClipboard}
          title="Copy issue as markdown"
        >
          {copied ? <Check className="h-4 w-4 text-green-500" /> : <Copy className="h-4 w-4" />}
        </Button>
        <Button
          variant="ghost"
          size="icon-xs"