Compare commits


440 Commits

Author SHA1 Message Date
Zoey
0da1cc184f fix #2847 2026-03-02 21:41:30 +01:00
Zoey
19c819fc95 fix #2843 by moving the advanced_config to the top of custom locations, but this could cause other issues 2026-03-02 21:41:30 +01:00
Zoey
8521cf19cc invert default of NGINX_TRUST_SECPR1 to true / add AUTH_REQUEST_ANUBIS_USE_CUSTOM_IMAGES env 2026-03-02 21:41:30 +01:00
Zoey
daed77142f fix some things that were made async a few commits ago 2026-03-02 21:41:30 +01:00
Zoey
537ca98f8f improve php-fpm settings
Signed-off-by: Zoey <zoey@z0ey.de>
2026-03-02 18:42:15 +01:00
renovate[bot]
f1e95f7ba6 dep updates
Signed-off-by: Zoey <zoey@z0ey.de>
2026-03-02 18:42:15 +01:00
Zoey
842b7d9a72 do not build images for PRs
Signed-off-by: Zoey <zoey@z0ey.de>
2026-03-01 11:25:23 +01:00
Zoey
ca8f602466 merge upstream 2026-02-27 23:12:46 +01:00
Zoey
57605b455c Merge remote-tracking branch 'upstream/develop' into develop 2026-02-27 23:02:50 +01:00
Zoey
bd06d48f0b make more async 2026-02-27 22:57:26 +01:00
jc21
c1d09eaceb Merge pull request #5353 from NginxProxyManager/dependabot/npm_and_yarn/docs/rollup-4.59.0
Bump rollup from 4.24.0 to 4.59.0 in /docs
2026-02-27 10:34:54 +10:00
jc21
9c509f30de Merge pull request #5355 from NginxProxyManager/dependabot/npm_and_yarn/frontend/rollup-4.59.0
Bump rollup from 4.57.1 to 4.59.0 in /frontend
2026-02-27 10:19:55 +10:00
dependabot[bot]
c85b11ee33 Bump rollup from 4.57.1 to 4.59.0 in /frontend
Bumps [rollup](https://github.com/rollup/rollup) from 4.57.1 to 4.59.0.
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.57.1...v4.59.0)

---
updated-dependencies:
- dependency-name: rollup
  dependency-version: 4.59.0
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-26 22:11:06 +00:00
jc21
cd5ef390b9 Merge pull request #5350 from NginxProxyManager/dependabot/npm_and_yarn/backend/basic-ftp-5.2.0
Bump basic-ftp from 5.1.0 to 5.2.0 in /backend
2026-02-27 08:09:57 +10:00
dependabot[bot]
d49cab1c0e Bump rollup from 4.24.0 to 4.59.0 in /docs
Bumps [rollup](https://github.com/rollup/rollup) from 4.24.0 to 4.59.0.
- [Release notes](https://github.com/rollup/rollup/releases)
- [Changelog](https://github.com/rollup/rollup/blob/master/CHANGELOG.md)
- [Commits](https://github.com/rollup/rollup/compare/v4.24.0...v4.59.0)

---
updated-dependencies:
- dependency-name: rollup
  dependency-version: 4.59.0
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-26 11:05:43 +00:00
renovate[bot]
d2b446192f dep updates
Signed-off-by: Zoey <zoey@z0ey.de>
2026-02-26 09:17:55 +01:00
dependabot[bot]
33b1a993ec Bump basic-ftp from 5.1.0 to 5.2.0 in /backend
Bumps [basic-ftp](https://github.com/patrickjuchli/basic-ftp) from 5.1.0 to 5.2.0.
- [Release notes](https://github.com/patrickjuchli/basic-ftp/releases)
- [Changelog](https://github.com/patrickjuchli/basic-ftp/blob/master/CHANGELOG.md)
- [Commits](https://github.com/patrickjuchli/basic-ftp/compare/v5.1.0...v5.2.0)

---
updated-dependencies:
- dependency-name: basic-ftp
  dependency-version: 5.2.0
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-25 23:29:28 +00:00
Jamie Curnow
67d40e186f Attempt to fix #5335 by allowing resolver generation to be opted out with an env var 2026-02-26 08:32:02 +10:00
jc21
52be66c43e Merge pull request #5346 from siimaarmaa/develop
Added Estonian language support.
2026-02-25 08:38:17 +10:00
jc21
ec46cabcd4 Merge pull request #5334 from bill-mahoney/fix/atomic-ipv6-config-write
Fix silent nginx config corruption in 50-ipv6.sh
2026-02-25 08:35:11 +10:00
jc21
a7a9cc3acb Merge pull request #5337 from NginxProxyManager/dependabot/npm_and_yarn/test/prod-patch-updates-2e2637830d
Bump eslint from 10.0.0 to 10.0.1 in /test in the prod-patch-updates group
2026-02-25 08:31:03 +10:00
jc21
020b3ebb33 Merge pull request #5338 from NginxProxyManager/dependabot/npm_and_yarn/backend/dev-patch-updates-d8f01b0b39
Bump nodemon from 3.1.13 to 3.1.14 in /backend in the dev-patch-updates group
2026-02-25 08:30:51 +10:00
jc21
c1c4baf389 Merge pull request #5339 from NginxProxyManager/dependabot/npm_and_yarn/test/prod-minor-updates-22dc529b9a
Bump eslint-plugin-cypress from 6.0.0 to 6.1.0 in /test in the prod-minor-updates group
2026-02-25 08:30:40 +10:00
jc21
672b5d6dd9 Merge pull request #5341 from NginxProxyManager/dependabot/npm_and_yarn/backend/prod-patch-updates-07cfa309fd
Bump mysql2 from 3.17.3 to 3.17.5 in /backend in the prod-patch-updates group
2026-02-25 08:30:00 +10:00
jc21
cd230b5878 Merge pull request #5342 from NginxProxyManager/dependabot/npm_and_yarn/frontend/dev-minor-updates-a1de0f9639
Bump happy-dom from 20.6.3 to 20.7.0 in /frontend in the dev-minor-updates group
2026-02-25 08:29:49 +10:00
jc21
a8f35062af Merge pull request #5343 from NginxProxyManager/dependabot/npm_and_yarn/frontend/prod-patch-updates-bfb85ae48b
Bump country-flag-icons from 1.6.13 to 1.6.14 in /frontend in the prod-patch-updates group
2026-02-25 08:29:06 +10:00
Jamie Curnow
da5955412d Command to regenerate nginx configs 2026-02-25 08:13:38 +10:00
siimaarmaa
adb27fe67d Added Estonian language support. First Estonian language update is HelpDocs. By Siim Aarmaa 2026-02-23 20:49:06 +02:00
dependabot[bot]
d874af8692 Bump country-flag-icons in /frontend in the prod-patch-updates group
Bumps the prod-patch-updates group in /frontend with 1 update: [country-flag-icons](https://gitlab.com/catamphetamine/country-flag-icons).


Updates `country-flag-icons` from 1.6.13 to 1.6.14
- [Changelog](https://gitlab.com/catamphetamine/country-flag-icons/blob/master/CHANGELOG.md)
- [Commits](https://gitlab.com/catamphetamine/country-flag-icons/compare/v1.6.13...v1.6.14)

---
updated-dependencies:
- dependency-name: country-flag-icons
  dependency-version: 1.6.14
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: prod-patch-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-23 13:51:21 +00:00
dependabot[bot]
0844dade98 Bump happy-dom in /frontend in the dev-minor-updates group
Bumps the dev-minor-updates group in /frontend with 1 update: [happy-dom](https://github.com/capricorn86/happy-dom).


Updates `happy-dom` from 20.6.3 to 20.7.0
- [Release notes](https://github.com/capricorn86/happy-dom/releases)
- [Commits](https://github.com/capricorn86/happy-dom/compare/v20.6.3...v20.7.0)

---
updated-dependencies:
- dependency-name: happy-dom
  dependency-version: 20.7.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: dev-minor-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-23 13:50:56 +00:00
dependabot[bot]
71d59516e8 Bump mysql2 in /backend in the prod-patch-updates group
Bumps the prod-patch-updates group in /backend with 1 update: [mysql2](https://github.com/sidorares/node-mysql2).


Updates `mysql2` from 3.17.3 to 3.17.5
- [Release notes](https://github.com/sidorares/node-mysql2/releases)
- [Changelog](https://github.com/sidorares/node-mysql2/blob/master/Changelog.md)
- [Commits](https://github.com/sidorares/node-mysql2/compare/v3.17.3...v3.17.5)

---
updated-dependencies:
- dependency-name: mysql2
  dependency-version: 3.17.5
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: prod-patch-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-23 13:50:35 +00:00
dependabot[bot]
06e220e184 Bump nodemon in /backend in the dev-patch-updates group
Bumps the dev-patch-updates group in /backend with 1 update: [nodemon](https://github.com/remy/nodemon).


Updates `nodemon` from 3.1.13 to 3.1.14
- [Release notes](https://github.com/remy/nodemon/releases)
- [Commits](https://github.com/remy/nodemon/compare/v3.1.13...v3.1.14)

---
updated-dependencies:
- dependency-name: nodemon
  dependency-version: 3.1.14
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: dev-patch-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-23 13:50:13 +00:00
dependabot[bot]
dc53647e76 Bump eslint-plugin-cypress in /test in the prod-minor-updates group
Bumps the prod-minor-updates group in /test with 1 update: [eslint-plugin-cypress](https://github.com/cypress-io/eslint-plugin-cypress).


Updates `eslint-plugin-cypress` from 6.0.0 to 6.1.0
- [Release notes](https://github.com/cypress-io/eslint-plugin-cypress/releases)
- [Commits](https://github.com/cypress-io/eslint-plugin-cypress/compare/v6.0.0...v6.1.0)

---
updated-dependencies:
- dependency-name: eslint-plugin-cypress
  dependency-version: 6.1.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: prod-minor-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-23 13:50:13 +00:00
dependabot[bot]
4c04e89483 Bump eslint in /test in the prod-patch-updates group
Bumps the prod-patch-updates group in /test with 1 update: [eslint](https://github.com/eslint/eslint).


Updates `eslint` from 10.0.0 to 10.0.1
- [Release notes](https://github.com/eslint/eslint/releases)
- [Commits](https://github.com/eslint/eslint/compare/v10.0.0...v10.0.1)

---
updated-dependencies:
- dependency-name: eslint
  dependency-version: 10.0.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: prod-patch-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-23 13:50:02 +00:00
Zoey
8c3e0f2809 use node:crypto instead of openssl/use dayjs instead of moment 2026-02-22 18:45:28 +01:00
Zoey
074a01546a add docs route 2026-02-22 18:04:58 +01:00
renovate[bot]
246d31c2fd dep updates 2026-02-22 18:04:58 +01:00
William Mahoney
7241869a9e Fix silent config corruption in 50-ipv6.sh on NFS volumes
Replace unsafe `echo "$(sed ...)" > $FILE` with atomic temp-file write.

The current pattern reads a file with sed inside a command substitution,
then writes the result back via echo redirection. If sed reads an empty
or momentarily unreadable file (e.g., NFS transient issue during
container recreation by Watchtower or similar tools), it produces no
output. The echo then writes exactly 1 byte (a newline) to the config
file, silently destroying its contents.

The fix writes sed output to a temp file first, checks it's non-empty
with `[ -s ]`, then atomically replaces the original via `mv`. If sed
produces empty output, the original file is preserved and a warning is
logged to stderr.
2026-02-20 21:24:40 -07:00
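The commit body above describes the unsafe pattern and its replacement. A minimal sketch of both, using throwaway temp files and a hypothetical substitution rather than the real 50-ipv6.sh config path and sed expression:

```shell
#!/bin/sh
# Sketch of the atomic-write fix described above. CONF stands in for the
# real nginx config file; the sed expression is illustrative only.
CONF=$(mktemp)
echo "listen [::]:80;" > "$CONF"

# Unsafe pattern: if sed reads an empty or momentarily unreadable file,
# the command substitution expands to nothing and echo truncates the
# config to a single newline:
#   echo "$(sed 's/:80/:443/' "$CONF")" > "$CONF"

# Safe pattern: write sed output to a temp file, verify it is non-empty
# with [ -s ], then replace the original atomically via mv.
TMP=$(mktemp)
if sed 's/\[::\]:80/[::]:443/' "$CONF" > "$TMP" && [ -s "$TMP" ]; then
    mv "$TMP" "$CONF"
else
    echo "WARN: sed produced empty output; keeping original $CONF" >&2
    rm -f "$TMP"
fi
cat "$CONF"
```

If sed ever emits nothing, the `[ -s ]` check fails, the temp file is discarded, and the original config survives untouched; `mv` within the same filesystem makes the replacement a single atomic rename.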
Zoey
0333ab08f3 Merge remote-tracking branch 'upstream/develop' into develop 2026-02-20 17:47:45 +01:00
Brian Norman
dda1c5ebe0 Add additional instructions to help those using open-appsec cloud and crowdsec, as it took me hours to work out why crowdsec was not seeing the events (#2790)
Signed-off-by: Brian Norman <703352+gingemonster@users.noreply.github.com>
Signed-off-by: Zoey <zoey@z0ey.de>
Co-authored-by: Zoey <zoey@z0ey.de>
2026-02-20 17:46:52 +01:00
Zoey
951062a6b9 switch to aws-lc/add patches for zlib-ng and brotli cert compression 2026-02-20 17:41:02 +01:00
jc21
94f6191a21 Merge pull request #5332 from NginxProxyManager/update-deps
Update deps
2026-02-20 11:54:46 +10:00
Jamie Curnow
cac52dd0ff Update linked deps 2026-02-20 11:20:59 +10:00
dependabot[bot]
906f177960 Bump tar from 7.5.7 to 7.5.9 in /test
Bumps [tar](https://github.com/isaacs/node-tar) from 7.5.7 to 7.5.9.
- [Release notes](https://github.com/isaacs/node-tar/releases)
- [Changelog](https://github.com/isaacs/node-tar/blob/main/CHANGELOG.md)
- [Commits](https://github.com/isaacs/node-tar/compare/v7.5.7...v7.5.9)

---
updated-dependencies:
- dependency-name: tar
  dependency-version: 7.5.9
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-20 11:20:59 +10:00
dependabot[bot]
f52afced5d Bump systeminformation from 5.30.6 to 5.31.1 in /test
Bumps [systeminformation](https://github.com/sebhildebrandt/systeminformation) from 5.30.6 to 5.31.1.
- [Release notes](https://github.com/sebhildebrandt/systeminformation/releases)
- [Changelog](https://github.com/sebhildebrandt/systeminformation/blob/master/CHANGELOG.md)
- [Commits](https://github.com/sebhildebrandt/systeminformation/compare/v5.30.6...v5.31.1)

---
updated-dependencies:
- dependency-name: systeminformation
  dependency-version: 5.31.1
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-20 11:20:59 +10:00
Jamie Curnow
e8224ff0af Update all dependencies 2026-02-20 11:02:56 +10:00
jc21
a4fa83d0ce Merge pull request #5326 from NginxProxyManager/dependabot/npm_and_yarn/test/tar-7.5.9
Bump tar from 7.5.7 to 7.5.9 in /test
2026-02-20 10:53:03 +10:00
jc21
770716ebf8 Merge pull request #5327 from NginxProxyManager/dependabot/npm_and_yarn/test/systeminformation-5.31.1
Bump systeminformation from 5.30.6 to 5.31.1 in /test
2026-02-20 10:52:51 +10:00
Zoey
17c2a68ff0 fix sec-fetch for oidc 2026-02-19 22:27:02 +01:00
Zoey
d144f54a6c fix missing default / location if custom location / is disabled 2026-02-19 22:09:54 +01:00
renovate[bot]
27fe362854 dep updates 2026-02-19 21:41:30 +01:00
Zoey
507a71cf9b improve error logging of ratelimited requests 2026-02-19 21:23:57 +01:00
Zoey
1b713e3a88 patch openappsec attachment to use zlib-ng 2026-02-19 21:04:10 +01:00
Zoey
c0c4f748b2 many security improvements: rate limits, limit upload size, fix: disabling totp and recreating backup codes now requires a valid code, dep updates 2026-02-19 19:11:52 +01:00
Zoey
ae13514410 fix ga-IE langname in selection/update and pin dep
Signed-off-by: Zoey <zoey@z0ey.de>
2026-02-19 18:33:40 +01:00
Zoey
814827be4e merge upstream 2026-02-18 23:39:34 +01:00
Zoey
30ad65c5a6 Merge remote-tracking branch 'upstream/develop' into develop 2026-02-18 23:38:20 +01:00
dependabot[bot]
f1067d3308 Bump systeminformation from 5.30.6 to 5.31.1 in /test
Bumps [systeminformation](https://github.com/sebhildebrandt/systeminformation) from 5.30.6 to 5.31.1.
- [Release notes](https://github.com/sebhildebrandt/systeminformation/releases)
- [Changelog](https://github.com/sebhildebrandt/systeminformation/blob/master/CHANGELOG.md)
- [Commits](https://github.com/sebhildebrandt/systeminformation/compare/v5.30.6...v5.31.1)

---
updated-dependencies:
- dependency-name: systeminformation
  dependency-version: 5.31.1
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-18 22:37:39 +00:00
dependabot[bot]
85c1a935ea Bump tar from 7.5.7 to 7.5.9 in /test
Bumps [tar](https://github.com/isaacs/node-tar) from 7.5.7 to 7.5.9.
- [Release notes](https://github.com/isaacs/node-tar/releases)
- [Changelog](https://github.com/isaacs/node-tar/blob/main/CHANGELOG.md)
- [Commits](https://github.com/isaacs/node-tar/compare/v7.5.7...v7.5.9)

---
updated-dependencies:
- dependency-name: tar
  dependency-version: 7.5.9
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-18 22:37:36 +00:00
Jamie Curnow
51ef7f3b86 Docs update, use package version instead of latest, refer to better mariadb image 2026-02-19 08:36:41 +10:00
Zoey
8484c69af8 merge upstream 2026-02-18 23:36:21 +01:00
Zoey
316cdf3479 Merge remote-tracking branch 'upstream/develop' into develop 2026-02-18 23:29:08 +01:00
Zoey
dffa4a9888 use zlib-ng instead of zlib/use quickjs-ng for njs/fix #2781/dep updates/ 2026-02-18 23:26:22 +01:00
jc21
846b94f7e8 Merge pull request #5324 from biodland/develop
chore: added Norwegian translation
2026-02-19 07:51:23 +10:00
Zoey
ace499a546 Merge remote-tracking branch 'upstream/develop' into develop 2026-02-18 15:57:54 +01:00
Birger
19e24c7e7e Rename langNO import to langNo for consistency 2026-02-18 12:24:20 +01:00
Birger
c1bc471dac chore: added Norwegian translation, added missing references. 2026-02-18 12:23:08 +01:00
Birger
608dc0b6bf chore: added Norwegian translation 2026-02-18 11:31:42 +01:00
Jamie Curnow
0dbf268f37 Fix #5284 for older sqlite3 configurations 2026-02-18 08:32:17 +10:00
Zoey
f9a49092ba merge upstream 2026-02-17 07:57:18 +01:00
Zoey
23c49447ab Merge remote-tracking branch 'upstream/develop' into develop 2026-02-17 07:41:00 +01:00
Zoey
dde694b57d force https for the npmplus and goaccess ui
Signed-off-by: Zoey <zoey@z0ey.de>
2026-02-17 07:29:54 +01:00
Zoey
9c9f82dc26 dep and doc updates
Signed-off-by: Zoey <zoey@z0ey.de>
2026-02-17 07:29:54 +01:00
jc21
c7437ddf8f Merge branch 'master' into develop 2026-02-17 15:02:29 +10:00
jc21
627f43c729 Merge pull request #5314 from NginxProxyManager/dependabot/npm_and_yarn/frontend/dev-minor-updates-d71d2fefd7
Bump the dev-minor-updates group in /frontend with 2 updates
2026-02-17 14:57:17 +10:00
jc21
fc4c5aac86 Merge pull request #5315 from NginxProxyManager/dependabot/npm_and_yarn/frontend/prod-patch-updates-95db6732c0
Bump the prod-patch-updates group in /frontend with 2 updates
2026-02-17 14:57:03 +10:00
jc21
aff390f35d Merge pull request #5317 from Tech-no-1/fix-custom-certificates
Fix uploading of custom certificates
2026-02-17 14:50:07 +10:00
Jamie Curnow
5f5a3870e4 Drop support for armv7 builds, bump version, update docs 2026-02-17 12:55:56 +10:00
Tech-no-1
40f363bd4f Fix uploading of custom certificates 2026-02-17 02:58:18 +01:00
dependabot[bot]
678fdd22c6 Bump the dev-minor-updates group in /frontend with 2 updates
Bumps the dev-minor-updates group in /frontend with 2 updates: [@biomejs/biome](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome) and [happy-dom](https://github.com/capricorn86/happy-dom).


Updates `@biomejs/biome` from 2.3.14 to 2.4.0
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.4.0/packages/@biomejs/biome)

Updates `happy-dom` from 20.5.3 to 20.6.1
- [Release notes](https://github.com/capricorn86/happy-dom/releases)
- [Commits](https://github.com/capricorn86/happy-dom/compare/v20.5.3...v20.6.1)

---
updated-dependencies:
- dependency-name: "@biomejs/biome"
  dependency-version: 2.4.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: dev-minor-updates
- dependency-name: happy-dom
  dependency-version: 20.6.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: dev-minor-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-17 01:43:56 +00:00
dependabot[bot]
6c3cc83d66 Bump the prod-patch-updates group in /frontend with 2 updates
Bumps the prod-patch-updates group in /frontend with 2 updates: [@tanstack/react-query](https://github.com/TanStack/query/tree/HEAD/packages/react-query) and [country-flag-icons](https://gitlab.com/catamphetamine/country-flag-icons).


Updates `@tanstack/react-query` from 5.90.20 to 5.90.21
- [Release notes](https://github.com/TanStack/query/releases)
- [Changelog](https://github.com/TanStack/query/blob/main/packages/react-query/CHANGELOG.md)
- [Commits](https://github.com/TanStack/query/commits/@tanstack/react-query@5.90.21/packages/react-query)

Updates `country-flag-icons` from 1.6.12 to 1.6.13
- [Changelog](https://gitlab.com/catamphetamine/country-flag-icons/blob/master/CHANGELOG.md)
- [Commits](https://gitlab.com/catamphetamine/country-flag-icons/compare/v1.6.12...v1.6.13)

---
updated-dependencies:
- dependency-name: "@tanstack/react-query"
  dependency-version: 5.90.21
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: prod-patch-updates
- dependency-name: country-flag-icons
  dependency-version: 1.6.13
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: prod-patch-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-17 01:43:37 +00:00
jc21
5916fd5bee Merge pull request #5313 from NginxProxyManager/dependabot/npm_and_yarn/backend/prod-minor-updates-4d12c0f7cc
Bump the prod-minor-updates group in /backend with 3 updates
2026-02-17 11:42:02 +10:00
jc21
f105673904 Merge pull request #5312 from NginxProxyManager/dependabot/npm_and_yarn/frontend/dev-patch-updates-38f6d3601d
Bump the dev-patch-updates group in /frontend with 3 updates
2026-02-17 11:41:44 +10:00
jc21
a37d0b88d6 Merge pull request #5308 from YTKme/ytkme/fix-sqlite-internal-error
Fix SQLite Internal Error
2026-02-17 11:41:31 +10:00
Jamie Curnow
43bc2a743e Add note to docs about retiring armv7 after June 2026 2026-02-17 11:38:17 +10:00
jc21
269545256a Merge pull request #5283 from broker-consulting/feat/add-czech-translation
Add Czech translation and related locale files
2026-02-17 11:06:40 +10:00
jc21
e5df45e9ef Merge pull request #5279 from dodog/develop
Update Slovak translation
2026-02-17 11:05:51 +10:00
dependabot[bot]
5601dd14fc Bump the prod-minor-updates group in /backend with 3 updates
Bumps the prod-minor-updates group in /backend with 3 updates: [ajv](https://github.com/ajv-validator/ajv), [mysql2](https://github.com/sidorares/node-mysql2) and [otplib](https://github.com/yeojz/otplib/tree/HEAD/packages/otplib).


Updates `ajv` from 8.17.1 to 8.18.0
- [Release notes](https://github.com/ajv-validator/ajv/releases)
- [Commits](https://github.com/ajv-validator/ajv/compare/v8.17.1...v8.18.0)

Updates `mysql2` from 3.16.3 to 3.17.1
- [Release notes](https://github.com/sidorares/node-mysql2/releases)
- [Changelog](https://github.com/sidorares/node-mysql2/blob/master/Changelog.md)
- [Commits](https://github.com/sidorares/node-mysql2/compare/v3.16.3...v3.17.1)

Updates `otplib` from 13.2.1 to 13.3.0
- [Release notes](https://github.com/yeojz/otplib/releases)
- [Changelog](https://github.com/yeojz/otplib/blob/main/release.config.json)
- [Commits](https://github.com/yeojz/otplib/commits/v13.3.0/packages/otplib)

---
updated-dependencies:
- dependency-name: ajv
  dependency-version: 8.18.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: prod-minor-updates
- dependency-name: mysql2
  dependency-version: 3.17.1
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: prod-minor-updates
- dependency-name: otplib
  dependency-version: 13.3.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: prod-minor-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-16 14:01:58 +00:00
dependabot[bot]
3e5655cfcd Bump the dev-patch-updates group in /frontend with 3 updates
Bumps the dev-patch-updates group in /frontend with 3 updates: [@types/react](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/react), [@vitejs/plugin-react](https://github.com/vitejs/vite-plugin-react/tree/HEAD/packages/plugin-react) and [vite-tsconfig-paths](https://github.com/aleclarson/vite-tsconfig-paths).


Updates `@types/react` from 19.2.13 to 19.2.14
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/react)

Updates `@vitejs/plugin-react` from 5.1.3 to 5.1.4
- [Release notes](https://github.com/vitejs/vite-plugin-react/releases)
- [Changelog](https://github.com/vitejs/vite-plugin-react/blob/main/packages/plugin-react/CHANGELOG.md)
- [Commits](https://github.com/vitejs/vite-plugin-react/commits/plugin-react@5.1.4/packages/plugin-react)

Updates `vite-tsconfig-paths` from 6.1.0 to 6.1.1
- [Release notes](https://github.com/aleclarson/vite-tsconfig-paths/releases)
- [Commits](https://github.com/aleclarson/vite-tsconfig-paths/compare/v6.1.0...v6.1.1)

---
updated-dependencies:
- dependency-name: "@types/react"
  dependency-version: 19.2.14
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: dev-patch-updates
- dependency-name: "@vitejs/plugin-react"
  dependency-version: 5.1.4
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: dev-patch-updates
- dependency-name: vite-tsconfig-paths
  dependency-version: 6.1.1
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: dev-patch-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-16 14:01:52 +00:00
Zoey
1e723b2f88 merge upstream 2026-02-16 09:35:32 +01:00
Zoey
27510c888d Merge remote-tracking branch 'upstream/develop' into develop 2026-02-16 09:35:25 +01:00
Zoey
7499388f49 re-add AES128-GCM-SHA256 cipher(suites) 2026-02-16 09:32:58 +01:00
Zoey
ed70405773 dep/doc updates
Signed-off-by: Zoey <zoey@z0ey.de>
2026-02-16 09:32:58 +01:00
jc21
a90af83270 Merge pull request #5309 from NginxProxyManager/dependabot/npm_and_yarn/test/prod-patch-updates-d4d031af8e
Bump @quobix/vacuum from 0.23.5 to 0.23.8 in /test in the prod-patch-updates group across 1 directory
2026-02-16 11:57:14 +10:00
jc21
619a8e5acc Merge pull request #5310 from NginxProxyManager/dependabot/npm_and_yarn/test/prod-minor-updates-aef0194d28
Bump eslint-plugin-cypress from 5.2.1 to 5.3.0 in /test in the prod-minor-updates group across 1 directory
2026-02-16 11:57:05 +10:00
jc21
6dcdefb57e Merge pull request #5294 from NginxProxyManager/dependabot/npm_and_yarn/frontend/dev-patch-updates-cb582034f5
Bump the dev-patch-updates group in /frontend with 2 updates
2026-02-16 10:30:46 +10:00
jc21
787616010b Merge pull request #5289 from kiaxseventh/develop
Added ArvanCloud DNS plugin support via certbot-dns-arvan package
2026-02-16 10:30:34 +10:00
dependabot[bot]
5891c291d2 Bump eslint-plugin-cypress
Bumps the prod-minor-updates group with 1 update in the /test directory: [eslint-plugin-cypress](https://github.com/cypress-io/eslint-plugin-cypress).


Updates `eslint-plugin-cypress` from 5.2.1 to 5.3.0
- [Release notes](https://github.com/cypress-io/eslint-plugin-cypress/releases)
- [Commits](https://github.com/cypress-io/eslint-plugin-cypress/compare/v5.2.1...v5.3.0)

---
updated-dependencies:
- dependency-name: eslint-plugin-cypress
  dependency-version: 5.3.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: prod-minor-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-16 00:29:59 +00:00
dependabot[bot]
41a2a41e67 Bump @quobix/vacuum
Bumps the prod-patch-updates group with 1 update in the /test directory: [@quobix/vacuum](https://github.com/daveshanley/vacuum).


Updates `@quobix/vacuum` from 0.23.5 to 0.23.8
- [Release notes](https://github.com/daveshanley/vacuum/releases)
- [Commits](https://github.com/daveshanley/vacuum/compare/v0.23.5...v0.23.8)

---
updated-dependencies:
- dependency-name: "@quobix/vacuum"
  dependency-version: 0.23.8
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: prod-patch-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-16 00:29:36 +00:00
jc21
379099d7ed Merge pull request #5292 from NginxProxyManager/dependabot/npm_and_yarn/test/prod-minor-updates-69046ce3b8
Bump cypress from 15.9.0 to 15.10.0 in /test in the prod-minor-updates group
2026-02-16 10:29:02 +10:00
jc21
dbeab93c02 Merge pull request #5293 from NginxProxyManager/dependabot/npm_and_yarn/test/eslint-10.0.0
Bump eslint from 9.39.2 to 10.0.0 in /test
2026-02-16 10:28:52 +10:00
jc21
010cb562a0 Merge pull request #5295 from NginxProxyManager/dependabot/npm_and_yarn/frontend/dev-minor-updates-24fcbaaf54
Bump vite-tsconfig-paths from 6.0.5 to 6.1.0 in /frontend in the dev-minor-updates group
2026-02-16 10:28:39 +10:00
jc21
7ff2fc1900 Merge pull request #5299 from NginxProxyManager/dependabot/npm_and_yarn/test/axios-1.13.5
Bump axios from 1.13.4 to 1.13.5 in /test
2026-02-16 10:28:26 +10:00
jc21
1c189a1888 Merge pull request #5300 from NginxProxyManager/dependabot/npm_and_yarn/test/jsonpath-1.2.1
Bump jsonpath from 1.1.1 to 1.2.1 in /test
2026-02-16 10:28:16 +10:00
jc21
f3c46487f6 Merge pull request #5303 from 7heMech/fix-2fa-logout
Add guardrail to fix disabling 2fa
2026-02-16 10:27:58 +10:00
jc21
fcca481d1b Merge pull request #5305 from NginxProxyManager/dependabot/npm_and_yarn/backend/qs-6.14.2
Bump qs from 6.14.1 to 6.14.2 in /backend
2026-02-16 10:27:28 +10:00
jc21
c59c237000 Merge pull request #5306 from NginxProxyManager/dependabot/npm_and_yarn/test/qs-6.14.2
Bump qs from 6.14.1 to 6.14.2 in /test
2026-02-16 10:27:17 +10:00
Zoey
94a0e4a42f fix default of npmplus_x_frame_options in custom locations again 2026-02-15 10:09:23 +01:00
Zoey
8cd52e7f65 fix default of npmplus_upstream_compression/npmplus_x_frame_options in custom locations 2026-02-15 09:57:40 +01:00
Zoey
04b3c36f8f small ui improvements 2026-02-15 09:44:18 +01:00
Zoey
8af895cc67 fix custom locations being always marked as off 2026-02-15 09:35:43 +01:00
Yan Kuang
a62b6de9f2 Update SQLite client configuration from sqlite3 to better-sqlite3 2026-02-14 23:53:43 -08:00
Zoey
c2c33709d6 re-add NGINX_WORKER_CONNECTIONS env/small fixes 2026-02-15 08:21:57 +01:00
Zoey
a2ba84ea6f prepare next release 2026-02-14 22:46:43 +01:00
Zoey
ea935ab578 split fancyindex/upstream compression button and add button to disable crowdsec appsec 2026-02-14 21:42:10 +01:00
Zoey
1025a4fcf3 add button to disable custom locations 2026-02-14 21:42:10 +01:00
Zoey
bdfc5a6086 remove NGINX_LOAD_GEOIP_MODULE (NOT geoip2) 2026-02-14 21:42:10 +01:00
Zoey
a03b9e008d update docs for new selections and csp
Signed-off-by: Zoey <zoey@z0ey.de>
2026-02-14 21:42:10 +01:00
Zoey
51a2f0549e hide http2 button in frontend and add http3 button 2026-02-14 21:42:10 +01:00
Zoey
559b5d2ab8 rename http3 column in backend 2026-02-14 21:42:10 +01:00
Zoey
50f898f805 invert SKIP_IP_RANGES by renaming it to TRUST_CLOUDFLARE 2026-02-14 17:51:15 +01:00
Zoey
3cdfb6d08d validate AUTH_REQUEST_ envs/fix proxying to sub paths
Signed-off-by: Zoey <zoey@z0ey.de>
2026-02-14 17:51:15 +01:00
Zoey
a0f8078dae add authentik-send-basic-auth 2026-02-14 17:51:15 +01:00
Zoey
10db251d49 use execFileSync in vite config
Signed-off-by: Zoey <zoey@z0ey.de>
2026-02-14 17:51:15 +01:00
Zoey
ac6d62aa4d fix csp 2026-02-14 17:51:15 +01:00
Zoey
d43a4f8fc2 only send X-Original-URL/X-Original-Method if needed 2026-02-14 17:51:15 +01:00
renovate[bot]
b9191f296f dep updates 2026-02-14 17:51:15 +01:00
dependabot[bot]
d92cc953e1 Bump qs from 6.14.1 to 6.14.2 in /test
Bumps [qs](https://github.com/ljharb/qs) from 6.14.1 to 6.14.2.
- [Changelog](https://github.com/ljharb/qs/blob/main/CHANGELOG.md)
- [Commits](https://github.com/ljharb/qs/compare/v6.14.1...v6.14.2)

---
updated-dependencies:
- dependency-name: qs
  dependency-version: 6.14.2
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-14 12:44:07 +00:00
dependabot[bot]
1b6412688b Bump qs from 6.14.1 to 6.14.2 in /backend
Bumps [qs](https://github.com/ljharb/qs) from 6.14.1 to 6.14.2.
- [Changelog](https://github.com/ljharb/qs/blob/main/CHANGELOG.md)
- [Commits](https://github.com/ljharb/qs/compare/v6.14.1...v6.14.2)

---
updated-dependencies:
- dependency-name: qs
  dependency-version: 6.14.2
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-14 12:44:05 +00:00
7heMech
1d14f72ba5 Add guardrail for disable 2fa 2026-02-14 06:28:59 +00:00
dependabot[bot]
099243aff7 Bump jsonpath from 1.1.1 to 1.2.1 in /test
Bumps [jsonpath](https://github.com/dchester/jsonpath) from 1.1.1 to 1.2.1.
- [Commits](https://github.com/dchester/jsonpath/commits/1.2.1)

---
updated-dependencies:
- dependency-name: jsonpath
  dependency-version: 1.2.1
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-12 15:31:26 +00:00
Zoey
27863855f9 fix: slow page loading with basic auth and only saved hashed passwords in the db 2026-02-12 00:44:45 +01:00
dependabot[bot]
5fe12f69ba Bump axios from 1.13.4 to 1.13.5 in /test
Bumps [axios](https://github.com/axios/axios) from 1.13.4 to 1.13.5.
- [Release notes](https://github.com/axios/axios/releases)
- [Changelog](https://github.com/axios/axios/blob/v1.x/CHANGELOG.md)
- [Commits](https://github.com/axios/axios/compare/v1.13.4...v1.13.5)

---
updated-dependencies:
- dependency-name: axios
  dependency-version: 1.13.5
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-11 20:22:09 +00:00
Zoey
c536f98483 fix #2740 2026-02-11 19:09:04 +01:00
Zoey
6d7f4a8b74 merge upstream 2026-02-11 19:09:04 +01:00
Zoey
f6dc10bf54 Merge remote-tracking branch 'upstream/develop' into develop 2026-02-11 18:55:21 +01:00
Zoey
644e3de10e add CSP drafts to NPMplus UI and goaccess 2026-02-11 18:04:38 +01:00
renovate[bot]
2630a628d4 dep updates 2026-02-11 18:04:38 +01:00
jc21
011191f645 Merge pull request #5260 from jerry-yuan/develop
Add trust_forwarded_proto option for SSL redirect handling in r…
2026-02-11 14:54:00 +10:00
Zoey
1178bfbc88 add some untested templates for auth providers 2026-02-10 23:06:36 +01:00
Zoey
d0554a2a5b Merge remote-tracking branch 'upstream/develop' into develop 2026-02-10 20:07:23 +01:00
Zoey
39ae2e6c51 fix: unsetting the acme profile does not reset it for existing certs, which will cause issues when switching to a different ca which does not support this profile 2026-02-10 20:06:21 +01:00
Zoey
4ce99b36ee add x_frame_options to the webui (and auth_request but it does nothing currently) 2026-02-10 20:06:21 +01:00
Zoey
0a41246be9 breaking: change proxy host api/split proxy buffering button 2026-02-10 20:06:21 +01:00
Zoey
312c3f1183 keep upstreams Referrer-Policy if sent 2026-02-10 20:06:21 +01:00
Zoey
f9d89e21a8 fix #2704 2026-02-10 20:06:21 +01:00
renovate[bot]
7309882798 dep updates 2026-02-10 20:06:20 +01:00
jerry-yuan
eeab425ea4 fix: unknown "trust_forwarded_proto" variable error when running with previously created virtual hosts 2026-02-10 10:53:17 +00:00
Jamie Curnow
13fbc53591 Fix bug when adding invalid custom certs 2026-02-10 14:54:33 +10:00
dependabot[bot]
3f2aec7b86 Bump vite-tsconfig-paths in /frontend in the dev-minor-updates group
Bumps the dev-minor-updates group in /frontend with 1 update: [vite-tsconfig-paths](https://github.com/aleclarson/vite-tsconfig-paths).


Updates `vite-tsconfig-paths` from 6.0.5 to 6.1.0
- [Release notes](https://github.com/aleclarson/vite-tsconfig-paths/releases)
- [Commits](https://github.com/aleclarson/vite-tsconfig-paths/compare/v6.0.5...v6.1.0)

---
updated-dependencies:
- dependency-name: vite-tsconfig-paths
  dependency-version: 6.1.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: dev-minor-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-09 14:29:32 +00:00
dependabot[bot]
09a3d65aa1 Bump the dev-patch-updates group in /frontend with 2 updates
Bumps the dev-patch-updates group in /frontend with 2 updates: [@types/react](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/react) and [happy-dom](https://github.com/capricorn86/happy-dom).


Updates `@types/react` from 19.2.10 to 19.2.13
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/react)

Updates `happy-dom` from 20.5.0 to 20.5.3
- [Release notes](https://github.com/capricorn86/happy-dom/releases)
- [Commits](https://github.com/capricorn86/happy-dom/compare/v20.5.0...v20.5.3)

---
updated-dependencies:
- dependency-name: "@types/react"
  dependency-version: 19.2.13
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: dev-patch-updates
- dependency-name: happy-dom
  dependency-version: 20.5.3
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: dev-patch-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-09 14:29:16 +00:00
dependabot[bot]
c910cf9512 Bump eslint from 9.39.2 to 10.0.0 in /test
Bumps [eslint](https://github.com/eslint/eslint) from 9.39.2 to 10.0.0.
- [Release notes](https://github.com/eslint/eslint/releases)
- [Commits](https://github.com/eslint/eslint/compare/v9.39.2...v10.0.0)

---
updated-dependencies:
- dependency-name: eslint
  dependency-version: 10.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-09 14:29:11 +00:00
dependabot[bot]
304c51aae8 Bump cypress in /test in the prod-minor-updates group
Bumps the prod-minor-updates group in /test with 1 update: [cypress](https://github.com/cypress-io/cypress).


Updates `cypress` from 15.9.0 to 15.10.0
- [Release notes](https://github.com/cypress-io/cypress/releases)
- [Changelog](https://github.com/cypress-io/cypress/blob/develop/CHANGELOG.md)
- [Commits](https://github.com/cypress-io/cypress/compare/v15.9.0...v15.10.0)

---
updated-dependencies:
- dependency-name: cypress
  dependency-version: 15.10.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: prod-minor-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-09 14:28:59 +00:00
kiaxseventh
b552eb90ed Add ArvanCloud DNS support 2026-02-09 13:02:18 +03:30
Zoey
c4e28331d3 fix #2652 2026-02-07 12:10:06 +01:00
Zoey
4b5be67742 improve readability of marked text
Signed-off-by: Zoey <zoey@z0ey.de>
2026-02-07 12:10:06 +01:00
Tomáš Novák
b78ef9bcd3 Add Czech translation and related locale files 2026-02-06 17:02:47 +01:00
Zoey
bb4286614d fix #2702 2026-02-06 07:14:54 +01:00
Jozef Gaal
7c67fafedf Update Slovak translation
Updated Slovak translations for 2FA and other features
2026-02-06 01:23:59 +01:00
Zoey
fe0600f777 fix short totp secrets also for disable and backup code recreation/allow backup codes to disable totp and to reset to new backup codes 2026-02-05 23:52:36 +01:00
renovate[bot]
d18cd479c2 dep updates 2026-02-05 23:47:06 +01:00
Zoey
3988a7713c docs: https://github.com/ZoeyVid/NPMplus/discussions/2695#discussioncomment-15704676
Signed-off-by: Zoey <zoey@z0ey.de>
2026-02-05 23:47:05 +01:00
jc21
47b367d61e Merge pull request #5276 from NginxProxyManager/develop
v2.13.7
2026-02-06 07:11:49 +10:00
Zoey
fd9e7b0644 merge upstream 2026-02-05 10:47:10 +01:00
Zoey
a78828c024 Merge remote-tracking branch 'upstream/develop' into develop 2026-02-05 10:45:47 +01:00
Jamie Curnow
d19f5c1960 Fix upgrade problem with otplib existing secrets 2026-02-05 13:12:54 +10:00
Jamie Curnow
77662b4e7f Use better-sqlite3 package for sqlite databases 2026-02-05 13:11:57 +10:00
Jamie Curnow
c88de65d3a Fix #5274 2fa backup codes not validating properly 2026-02-05 10:51:15 +10:00
jc21
ac4efd2333 Merge branch 'master' into develop 2026-02-05 08:27:41 +10:00
Jamie Curnow
eab38d8934 Bump version 2026-02-05 08:26:49 +10:00
Zoey
117ab191a7 Merge remote-tracking branch 'upstream/develop' into develop 2026-02-04 23:00:17 +01:00
jc21
4833dcbf3a Merge pull request #5237 from NginxProxyManager/dependabot/npm_and_yarn/backend/dev-patch-updates-2bda1081ab
Bump @biomejs/biome from 2.3.12 to 2.3.13 in /backend in the dev-patch-updates group
2026-02-05 07:58:38 +10:00
jc21
c6fba1cbfe Merge pull request #5272 from NginxProxyManager/dependabot/npm_and_yarn/backend/prod-patch-updates-627d993332
Bump mysql2 from 3.16.2 to 3.16.3 in /backend in the prod-patch-updates group
2026-02-05 07:58:00 +10:00
renovate[bot]
9bcef300b8 update nginx/dep updates 2026-02-04 21:48:46 +01:00
Zoey
94e4b51f91 Merge remote-tracking branch 'upstream/develop' into develop 2026-02-04 10:38:18 +01:00
jc21
cdde543e8a Merge pull request #5273 from NginxProxyManager/dependabot/npm_and_yarn/frontend/dev-patch-updates-1f568f1195
Bump @biomejs/biome from 2.3.13 to 2.3.14 in /frontend in the dev-patch-updates group
2026-02-04 12:06:56 +10:00
Jamie Curnow
0d62c26164 Fix linting 2026-02-04 10:43:14 +10:00
Jamie Curnow
c3173d83b8 Update biome.json to match biome version 2026-02-04 10:39:43 +10:00
Jamie Curnow
6ba40216cd Update biome.json to match biome version 2026-02-04 10:38:56 +10:00
dependabot[bot]
3c54413752 Bump mysql2 in /backend in the prod-patch-updates group
Bumps the prod-patch-updates group in /backend with 1 update: [mysql2](https://github.com/sidorares/node-mysql2).


Updates `mysql2` from 3.16.2 to 3.16.3
- [Release notes](https://github.com/sidorares/node-mysql2/releases)
- [Changelog](https://github.com/sidorares/node-mysql2/blob/master/Changelog.md)
- [Commits](https://github.com/sidorares/node-mysql2/compare/v3.16.2...v3.16.3)

---
updated-dependencies:
- dependency-name: mysql2
  dependency-version: 3.16.3
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: prod-patch-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-04 00:15:00 +00:00
jc21
65cf8ce583 Merge pull request #5248 from NginxProxyManager/dependabot/npm_and_yarn/backend/otplib-13.2.1
Bump otplib from 12.0.1 to 13.2.1 in /backend
2026-02-04 10:13:27 +10:00
Zoey
dcb45fcd65 Merge remote-tracking branch 'upstream/develop' into develop 2026-02-03 23:13:42 +01:00
dependabot[bot]
a4bc8d5d21 Bump @biomejs/biome in /frontend in the dev-patch-updates group
Bumps the dev-patch-updates group in /frontend with 1 update: [@biomejs/biome](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome).


Updates `@biomejs/biome` from 2.3.13 to 2.3.14
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.3.14/packages/@biomejs/biome)

---
updated-dependencies:
- dependency-name: "@biomejs/biome"
  dependency-version: 2.3.14
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: dev-patch-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-03 21:52:25 +00:00
dependabot[bot]
2bcf5e91ce Bump @biomejs/biome in /backend in the dev-patch-updates group
Bumps the dev-patch-updates group in /backend with 1 update: [@biomejs/biome](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome).


Updates `@biomejs/biome` from 2.3.12 to 2.3.13
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.3.13/packages/@biomejs/biome)

---
updated-dependencies:
- dependency-name: "@biomejs/biome"
  dependency-version: 2.3.13
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: dev-patch-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-03 21:51:39 +00:00
Jamie Curnow
3e3d08b68f Change dependabot interval to weekly 2026-02-04 07:50:26 +10:00
Jamie Curnow
f90066822f Fix v13 otplib upgrades 2026-02-04 07:47:16 +10:00
dependabot[bot]
bb4b5fb3aa Bump otplib from 12.0.1 to 13.2.1 in /backend
Bumps [otplib](https://github.com/yeojz/otplib/tree/HEAD/packages/otplib) from 12.0.1 to 13.2.1.
- [Release notes](https://github.com/yeojz/otplib/releases)
- [Commits](https://github.com/yeojz/otplib/commits/v13.2.1/packages/otplib)

---
updated-dependencies:
- dependency-name: otplib
  dependency-version: 13.2.1
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-03 21:22:20 +00:00
jc21
8014f34195 Merge pull request #5269 from NginxProxyManager/dependabot/npm_and_yarn/backend/prod-minor-updates-2bc8aaf294
Bump pg from 8.17.2 to 8.18.0 in /backend in the prod-minor-updates group
2026-02-04 07:20:55 +10:00
jc21
4f8037ded2 Merge pull request #5270 from NginxProxyManager/dependabot/npm_and_yarn/frontend/dev-minor-updates-1492aee52e
Bump happy-dom from 20.4.0 to 20.5.0 in /frontend in the dev-minor-updates group
2026-02-04 07:18:42 +10:00
jc21
e7a1f84e45 Merge pull request #5271 from NginxProxyManager/dependabot/npm_and_yarn/frontend/prod-patch-updates-4c40e63da3
Bump react-intl from 8.1.2 to 8.1.3 in /frontend in the prod-patch-updates group
2026-02-04 07:18:31 +10:00
renovate[bot]
7b5fa23af6 dep updates/fix biome
Signed-off-by: Zoey <zoey@z0ey.de>
2026-02-03 19:39:40 +01:00
Zoey
b473ee9299 fix #2683 2026-02-03 19:15:37 +01:00
Zoey
7ab1f56e40 merge upstream 2026-02-03 15:36:34 +01:00
Zoey
731ad48584 Merge remote-tracking branch 'upstream/develop' into develop 2026-02-03 15:27:55 +01:00
renovate[bot]
fc4278625e dep updates
Signed-off-by: Zoey <zoey@z0ey.de>
2026-02-03 14:37:54 +01:00
dependabot[bot]
6f0931bed5 Bump react-intl in /frontend in the prod-patch-updates group
Bumps the prod-patch-updates group in /frontend with 1 update: [react-intl](https://github.com/formatjs/formatjs).


Updates `react-intl` from 8.1.2 to 8.1.3
- [Release notes](https://github.com/formatjs/formatjs/releases)
- [Commits](https://github.com/formatjs/formatjs/compare/react-intl@8.1.2...react-intl@8.1.3)

---
updated-dependencies:
- dependency-name: react-intl
  dependency-version: 8.1.3
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: prod-patch-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-03 13:04:48 +00:00
dependabot[bot]
7f0c5d4364 Bump happy-dom in /frontend in the dev-minor-updates group
Bumps the dev-minor-updates group in /frontend with 1 update: [happy-dom](https://github.com/capricorn86/happy-dom).


Updates `happy-dom` from 20.4.0 to 20.5.0
- [Release notes](https://github.com/capricorn86/happy-dom/releases)
- [Commits](https://github.com/capricorn86/happy-dom/compare/v20.4.0...v20.5.0)

---
updated-dependencies:
- dependency-name: happy-dom
  dependency-version: 20.5.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: dev-minor-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-03 13:04:21 +00:00
dependabot[bot]
60404b6f7e Bump pg in /backend in the prod-minor-updates group
Bumps the prod-minor-updates group in /backend with 1 update: [pg](https://github.com/brianc/node-postgres/tree/HEAD/packages/pg).


Updates `pg` from 8.17.2 to 8.18.0
- [Changelog](https://github.com/brianc/node-postgres/blob/master/CHANGELOG.md)
- [Commits](https://github.com/brianc/node-postgres/commits/pg@8.18.0/packages/pg)

---
updated-dependencies:
- dependency-name: pg
  dependency-version: 8.18.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: prod-minor-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-03 13:04:13 +00:00
jc21
c2fddee2c7 Merge pull request #5264 from NginxProxyManager/dependabot/npm_and_yarn/frontend/dev-patch-updates-cc9765ca56
Bump the dev-patch-updates group across 1 directory with 3 updates
2026-02-03 17:10:42 +10:00
Jerry8块
b7402d47a0 Merge branch 'NginxProxyManager:develop' into develop 2026-02-03 15:10:13 +08:00
jc21
f09876d31b Merge pull request #5252 from NginxProxyManager/dependabot/npm_and_yarn/backend/apidevtools/json-schema-ref-parser-14.1.1
Bump @apidevtools/json-schema-ref-parser from 11.9.3 to 14.1.1 in /backend
2026-02-03 17:06:56 +10:00
dependabot[bot]
8708a3bab8 Bump the dev-patch-updates group across 1 directory with 3 updates
Bumps the dev-patch-updates group with 3 updates in the /frontend directory: [@formatjs/cli](https://github.com/formatjs/formatjs), [@tanstack/react-query-devtools](https://github.com/TanStack/query/tree/HEAD/packages/react-query-devtools) and [@vitejs/plugin-react](https://github.com/vitejs/vite-plugin-react/tree/HEAD/packages/plugin-react).


Updates `@formatjs/cli` from 6.12.0 to 6.12.1
- [Release notes](https://github.com/formatjs/formatjs/releases)
- [Commits](https://github.com/formatjs/formatjs/compare/@formatjs/cli@6.12.0...@formatjs/cli@6.12.1)

Updates `@tanstack/react-query-devtools` from 5.91.2 to 5.91.3
- [Release notes](https://github.com/TanStack/query/releases)
- [Changelog](https://github.com/TanStack/query/blob/main/packages/react-query-devtools/CHANGELOG.md)
- [Commits](https://github.com/TanStack/query/commits/@tanstack/react-query-devtools@5.91.3/packages/react-query-devtools)

Updates `@vitejs/plugin-react` from 5.1.2 to 5.1.3
- [Release notes](https://github.com/vitejs/vite-plugin-react/releases)
- [Changelog](https://github.com/vitejs/vite-plugin-react/blob/main/packages/plugin-react/CHANGELOG.md)
- [Commits](https://github.com/vitejs/vite-plugin-react/commits/plugin-react@5.1.3/packages/plugin-react)

---
updated-dependencies:
- dependency-name: "@formatjs/cli"
  dependency-version: 6.12.1
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: dev-patch-updates
- dependency-name: "@tanstack/react-query-devtools"
  dependency-version: 5.91.3
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: dev-patch-updates
- dependency-name: "@vitejs/plugin-react"
  dependency-version: 5.1.3
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: dev-patch-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-03 07:06:20 +00:00
jc21
218fadd168 Merge pull request #5254 from NginxProxyManager/dependabot/npm_and_yarn/backend/body-parser-2.2.2
Bump body-parser from 1.20.4 to 2.2.2 in /backend
2026-02-03 17:04:49 +10:00
jc21
9cf1d000c8 Merge pull request #5257 from maghuro/add-pt-pt
Add pt-PT lang
2026-02-03 17:04:19 +10:00
jc21
714bebbbc7 Merge pull request #5263 from NginxProxyManager/dependabot/npm_and_yarn/test/prod-patch-updates-5b27633cb0
Bump @quobix/vacuum from 0.23.4 to 0.23.5 in /test in the prod-patch-updates group
2026-02-03 17:02:59 +10:00
jc21
127008c9b5 Merge pull request #5265 from NginxProxyManager/dependabot/npm_and_yarn/frontend/react-intl-8.1.2
Bump react-intl from 7.1.14 to 8.1.2 in /frontend
2026-02-03 17:02:18 +10:00
dependabot[bot]
7cc2bfbf6a Bump react-intl from 7.1.14 to 8.1.2 in /frontend
Bumps [react-intl](https://github.com/formatjs/formatjs) from 7.1.14 to 8.1.2.
- [Release notes](https://github.com/formatjs/formatjs/releases)
- [Commits](https://github.com/formatjs/formatjs/compare/react-intl@7.1.14...react-intl@8.1.2)

---
updated-dependencies:
- dependency-name: react-intl
  dependency-version: 8.1.2
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-02 14:26:07 +00:00
dependabot[bot]
de3b543d08 Bump @quobix/vacuum in /test in the prod-patch-updates group
Bumps the prod-patch-updates group in /test with 1 update: [@quobix/vacuum](https://github.com/daveshanley/vacuum).


Updates `@quobix/vacuum` from 0.23.4 to 0.23.5
- [Release notes](https://github.com/daveshanley/vacuum/releases)
- [Commits](https://github.com/daveshanley/vacuum/compare/v0.23.4...v0.23.5)

---
updated-dependencies:
- dependency-name: "@quobix/vacuum"
  dependency-version: 0.23.5
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: prod-patch-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-02-02 14:25:26 +00:00
jerry-yuan
21f63e3db3 fix: delete advanced options from redir_host/dead_host/streams 2026-02-01 10:38:09 +00:00
Jerry
232b5b759a fix: make variable name meaningful 2026-02-01 00:16:17 +08:00
jerry-yuan
054742539f fix: Supplement Swagger documentation 2026-01-31 14:17:05 +00:00
jerry-yuan
2b6a617599 fix: reformat migration scripts 2026-01-31 13:28:53 +00:00
jerry-yuan
187d21a0d5 feat: add trust_forwarded_proto option for SSL redirect handling in reverse proxy scenarios
When Nginx is behind another proxy server (like CloudFlare or AWS ALB), the force-SSL
feature can cause redirect loops because Nginx sees the connection as plain HTTP
while SSL is already handled upstream. This adds a new boolean option to trust
the X-Forwarded-Proto header from upstream proxies.

Changes:
- Add `trust_forwarded_proto` column to proxy_host table (migration)
- Update model and API schema to support the new boolean field
- Modify force-ssl Nginx template to check X-Forwarded-Proto/X-Forwarded-Scheme
- Add map directives in nginx.conf to validate and sanitize forwarded headers
- Add advanced option toggle in frontend UI with i18n support (EN/ZH)
- Set proxy headers from validated map variables instead of $scheme

This allows administrators to control SSL redirect behavior when Nginx is deployed
behind a TLS-terminating proxy.
2026-01-31 13:11:47 +00:00
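The map-based header validation described in the commit above could look roughly like the following. This is an illustrative sketch only, not the actual code from the PR; the variable name `$forwarded_scheme` and the exact template placement are assumptions.

```nginx
# Sketch of the approach: sanitize X-Forwarded-Proto so only the literal
# value "https" from the upstream proxy is trusted; anything else falls
# back to the scheme of the direct connection.
map $http_x_forwarded_proto $forwarded_scheme {
    default $scheme;
    https   https;
}

server {
    listen 80;
    server_name example.com;

    # When trust_forwarded_proto is enabled, the force-ssl template would
    # redirect based on the sanitized forwarded scheme instead of $scheme,
    # avoiding redirect loops behind a TLS-terminating proxy.
    if ($forwarded_scheme != https) {
        return 301 https://$host$request_uri;
    }
}
```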
maghuro
c515815b0e Remove merge conflict markers from lang-list.json 2026-01-31 12:10:36 +00:00
maghuro
3db02370fd Add Portuguese language support to IntlProvider 2026-01-31 12:08:07 +00:00
maghuro
4ad1af5576 Remove duplicate locale entries and keep pt-PT 2026-01-31 12:07:32 +00:00
maghuro
a73d54fedc Add Portuguese (European) language support 2026-01-31 12:06:50 +00:00
maghuro
8c8005f817 Add Portuguese language support to HelpDoc 2026-01-31 12:05:32 +00:00
maghuro
83d993578b Add pt-PT lang
Add Portuguese (European) language
2026-01-31 11:59:35 +00:00
Zoey
1b74559dc8 Merge remote-tracking branch 'upstream/develop' into develop 2026-01-30 22:14:28 +01:00
renovate[bot]
d8f25eb304 Update dependency @tanstack/react-query-devtools to v5.91.3 2026-01-30 14:48:09 +01:00
dependabot[bot]
8532e7520f Bump body-parser from 1.20.4 to 2.2.2 in /backend
Bumps [body-parser](https://github.com/expressjs/body-parser) from 1.20.4 to 2.2.2.
- [Release notes](https://github.com/expressjs/body-parser/releases)
- [Changelog](https://github.com/expressjs/body-parser/blob/master/HISTORY.md)
- [Commits](https://github.com/expressjs/body-parser/compare/1.20.4...v2.2.2)

---
updated-dependencies:
- dependency-name: body-parser
  dependency-version: 2.2.2
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-30 13:04:20 +00:00
dependabot[bot]
58d47cd69a Bump @apidevtools/json-schema-ref-parser in /backend
Bumps [@apidevtools/json-schema-ref-parser](https://github.com/APIDevTools/json-schema-ref-parser) from 11.9.3 to 14.1.1.
- [Release notes](https://github.com/APIDevTools/json-schema-ref-parser/releases)
- [Commits](https://github.com/APIDevTools/json-schema-ref-parser/compare/v11.9.3...v14.1.1)

---
updated-dependencies:
- dependency-name: "@apidevtools/json-schema-ref-parser"
  dependency-version: 14.1.1
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-30 13:04:04 +00:00
jc21
bad3eac515 Merge pull request #5245 from NginxProxyManager/dependabot/npm_and_yarn/backend/archiver-7.0.1
Bump archiver from 5.3.2 to 7.0.1 in /backend
2026-01-30 13:40:19 +10:00
dependabot[bot]
00b58f73f8 Bump archiver from 5.3.2 to 7.0.1 in /backend
Bumps [archiver](https://github.com/archiverjs/node-archiver) from 5.3.2 to 7.0.1.
- [Release notes](https://github.com/archiverjs/node-archiver/releases)
- [Changelog](https://github.com/archiverjs/node-archiver/blob/master/CHANGELOG.md)
- [Commits](https://github.com/archiverjs/node-archiver/compare/5.3.2...7.0.1)

---
updated-dependencies:
- dependency-name: archiver
  dependency-version: 7.0.1
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-30 01:35:10 +00:00
jc21
47981f0d56 Merge pull request #5230 from NginxProxyManager/dependabot/npm_and_yarn/frontend/prod-minor-updates-37a0ff9301
Bump the prod-minor-updates group in /frontend with 4 updates
2026-01-30 11:33:53 +10:00
jc21
38257859e2 Merge pull request #5244 from NginxProxyManager/dependabot/npm_and_yarn/backend/bcrypt-6.0.0
Bump bcrypt from 5.1.1 to 6.0.0 in /backend
2026-01-30 11:33:34 +10:00
dependabot[bot]
a169e1131c Bump the prod-minor-updates group in /frontend with 4 updates
Bumps the prod-minor-updates group in /frontend with 4 updates: [@tabler/icons-react](https://github.com/tabler/tabler-icons/tree/HEAD/packages/icons-react), [country-flag-icons](https://gitlab.com/catamphetamine/country-flag-icons), [react-router-dom](https://github.com/remix-run/react-router/tree/HEAD/packages/react-router-dom) and [rooks](https://github.com/imbhargav5/rooks).


Updates `@tabler/icons-react` from 3.35.0 to 3.36.1
- [Release notes](https://github.com/tabler/tabler-icons/releases)
- [Commits](https://github.com/tabler/tabler-icons/commits/v3.36.1/packages/icons-react)

Updates `country-flag-icons` from 1.5.21 to 1.6.8
- [Changelog](https://gitlab.com/catamphetamine/country-flag-icons/blob/master/CHANGELOG.md)
- [Commits](https://gitlab.com/catamphetamine/country-flag-icons/compare/v1.5.21...v1.6.8)

Updates `react-router-dom` from 7.9.5 to 7.13.0
- [Release notes](https://github.com/remix-run/react-router/releases)
- [Changelog](https://github.com/remix-run/react-router/blob/main/packages/react-router-dom/CHANGELOG.md)
- [Commits](https://github.com/remix-run/react-router/commits/react-router-dom@7.13.0/packages/react-router-dom)

Updates `rooks` from 9.3.0 to 9.5.0
- [Release notes](https://github.com/imbhargav5/rooks/releases)
- [Commits](https://github.com/imbhargav5/rooks/compare/rooks@9.3.0...rooks@9.5.0)

---
updated-dependencies:
- dependency-name: "@tabler/icons-react"
  dependency-version: 3.36.1
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: prod-minor-updates
- dependency-name: country-flag-icons
  dependency-version: 1.6.8
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: prod-minor-updates
- dependency-name: react-router-dom
  dependency-version: 7.13.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: prod-minor-updates
- dependency-name: rooks
  dependency-version: 9.5.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: prod-minor-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-29 23:36:00 +00:00
dependabot[bot]
a99cde9cd8 Bump bcrypt from 5.1.1 to 6.0.0 in /backend
Bumps [bcrypt](https://github.com/kelektiv/node.bcrypt.js) from 5.1.1 to 6.0.0.
- [Release notes](https://github.com/kelektiv/node.bcrypt.js/releases)
- [Changelog](https://github.com/kelektiv/node.bcrypt.js/blob/master/CHANGELOG.md)
- [Commits](https://github.com/kelektiv/node.bcrypt.js/compare/v5.1.1...v6.0.0)

---
updated-dependencies:
- dependency-name: bcrypt
  dependency-version: 6.0.0
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-29 23:33:36 +00:00
jc21
c69bd187af Merge pull request #5243 from NginxProxyManager/dependabot/npm_and_yarn/test/prod-patch-updates-4953ba4782
Bump axios from 1.13.3 to 1.13.4 in /test in the prod-patch-updates group
2026-01-30 09:33:18 +10:00
jc21
98fe622967 Merge pull request #5246 from NginxProxyManager/dependabot/npm_and_yarn/frontend/dev-minor-updates-a6c26cdf84
Bump happy-dom from 20.3.9 to 20.4.0 in /frontend in the dev-minor-updates group
2026-01-30 09:33:05 +10:00
jc21
eddca3597d Merge pull request #5247 from NginxProxyManager/dependabot/npm_and_yarn/backend/express-5.2.1
Bump express from 4.22.0 to 5.2.1 in /backend
2026-01-30 09:31:33 +10:00
jc21
ed0b2306a2 Merge pull request #5250 from NginxProxyManager/dependabot/npm_and_yarn/test/tar-7.5.7
Bump tar from 7.5.6 to 7.5.7 in /test
2026-01-30 09:31:23 +10:00
jc21
17f6050de2 Merge pull request #5235 from NginxProxyManager/dependabot/npm_and_yarn/frontend/prod-patch-updates-9d9e6eac1f
Bump the prod-patch-updates group across 1 directory with 4 updates
2026-01-30 09:31:12 +10:00
dependabot[bot]
469d72a2f9 Bump tar from 7.5.6 to 7.5.7 in /test
Bumps [tar](https://github.com/isaacs/node-tar) from 7.5.6 to 7.5.7.
- [Release notes](https://github.com/isaacs/node-tar/releases)
- [Changelog](https://github.com/isaacs/node-tar/blob/main/CHANGELOG.md)
- [Commits](https://github.com/isaacs/node-tar/compare/v7.5.6...v7.5.7)

---
updated-dependencies:
- dependency-name: tar
  dependency-version: 7.5.7
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-29 14:02:55 +00:00
Zoey
9006ba3eba Merge remote-tracking branch 'upstream/develop' into develop 2026-01-29 14:40:42 +01:00
renovate[bot]
e10cc22ff6 dep updates/Comment out ACME_EMAIL env by default 2026-01-29 14:39:09 +01:00
dependabot[bot]
3ed3ec0001 Bump express from 4.22.0 to 5.2.1 in /backend
Bumps [express](https://github.com/expressjs/express) from 4.22.0 to 5.2.1.
- [Release notes](https://github.com/expressjs/express/releases)
- [Changelog](https://github.com/expressjs/express/blob/master/History.md)
- [Commits](https://github.com/expressjs/express/compare/4.22.0...v5.2.1)

---
updated-dependencies:
- dependency-name: express
  dependency-version: 5.2.1
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-28 13:04:19 +00:00
dependabot[bot]
24ff3c7b11 Bump happy-dom in /frontend in the dev-minor-updates group
Bumps the dev-minor-updates group in /frontend with 1 update: [happy-dom](https://github.com/capricorn86/happy-dom).


Updates `happy-dom` from 20.3.9 to 20.4.0
- [Release notes](https://github.com/capricorn86/happy-dom/releases)
- [Commits](https://github.com/capricorn86/happy-dom/compare/v20.3.9...v20.4.0)

---
updated-dependencies:
- dependency-name: happy-dom
  dependency-version: 20.4.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: dev-minor-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-28 13:04:12 +00:00
dependabot[bot]
58dda941b8 Bump axios in /test in the prod-patch-updates group
Bumps the prod-patch-updates group in /test with 1 update: [axios](https://github.com/axios/axios).


Updates `axios` from 1.13.3 to 1.13.4
- [Release notes](https://github.com/axios/axios/releases)
- [Changelog](https://github.com/axios/axios/blob/v1.x/CHANGELOG.md)
- [Commits](https://github.com/axios/axios/compare/v1.13.3...v1.13.4)

---
updated-dependencies:
- dependency-name: axios
  dependency-version: 1.13.4
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: prod-patch-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-28 13:03:42 +00:00
dependabot[bot]
f9f743499f Bump the prod-patch-updates group across 1 directory with 4 updates
Bumps the prod-patch-updates group with 4 updates in the /frontend directory: [@tanstack/react-query](https://github.com/TanStack/query/tree/HEAD/packages/react-query), [formik](https://github.com/jaredpalmer/formik), [react](https://github.com/facebook/react/tree/HEAD/packages/react) and [react-dom](https://github.com/facebook/react/tree/HEAD/packages/react-dom).


Updates `@tanstack/react-query` from 5.90.6 to 5.90.20
- [Release notes](https://github.com/TanStack/query/releases)
- [Changelog](https://github.com/TanStack/query/blob/main/packages/react-query/CHANGELOG.md)
- [Commits](https://github.com/TanStack/query/commits/@tanstack/react-query@5.90.20/packages/react-query)

Updates `formik` from 2.4.6 to 2.4.9
- [Release notes](https://github.com/jaredpalmer/formik/releases)
- [Commits](https://github.com/jaredpalmer/formik/compare/formik@2.4.6...formik@2.4.9)

Updates `react` from 19.2.3 to 19.2.4
- [Release notes](https://github.com/facebook/react/releases)
- [Changelog](https://github.com/facebook/react/blob/main/CHANGELOG.md)
- [Commits](https://github.com/facebook/react/commits/v19.2.4/packages/react)

Updates `react-dom` from 19.2.3 to 19.2.4
- [Release notes](https://github.com/facebook/react/releases)
- [Changelog](https://github.com/facebook/react/blob/main/CHANGELOG.md)
- [Commits](https://github.com/facebook/react/commits/v19.2.4/packages/react-dom)

---
updated-dependencies:
- dependency-name: "@tanstack/react-query"
  dependency-version: 5.90.20
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: prod-patch-updates
- dependency-name: formik
  dependency-version: 2.4.9
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: prod-patch-updates
- dependency-name: react
  dependency-version: 19.2.4
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: prod-patch-updates
- dependency-name: react-dom
  dependency-version: 19.2.4
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: prod-patch-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-28 05:00:34 +00:00
Jamie Curnow
534afe6067 Implement suggestion from #5216, hopefully rectifying https -> forced-https hosts 2026-01-28 14:04:32 +10:00
jc21
9580903f5d Merge pull request #5239 from NginxProxyManager/dependabot/npm_and_yarn/backend/apidevtools/swagger-parser-12.1.0
Bump @apidevtools/swagger-parser from 10.1.1 to 12.1.0 in /backend
2026-01-28 13:39:51 +10:00
dependabot[bot]
df81c8425f Bump @apidevtools/swagger-parser from 10.1.1 to 12.1.0 in /backend
Bumps [@apidevtools/swagger-parser](https://github.com/APIDevTools/swagger-parser) from 10.1.1 to 12.1.0.
- [Release notes](https://github.com/APIDevTools/swagger-parser/releases)
- [Changelog](https://github.com/APIDevTools/swagger-parser/blob/main/CHANGELOG.md)
- [Commits](https://github.com/APIDevTools/swagger-parser/compare/v10.1.1...v12.1.0)

---
updated-dependencies:
- dependency-name: "@apidevtools/swagger-parser"
  dependency-version: 12.1.0
  dependency-type: direct:development
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-28 02:56:17 +00:00
Jamie Curnow
b6f421c5fc Update biome schema 2026-01-28 12:54:55 +10:00
Zoey
1dd345fb1b Merge remote-tracking branch 'upstream/develop' into develop 2026-01-27 23:24:43 +01:00
Zoey
d57f84f973 add prefix to npmplus unique columns 2026-01-27 23:24:18 +01:00
Zoey
6075d1d433 nginx patch: do not change request_body_file_log_level directly 2026-01-27 23:24:18 +01:00
Zoey
93522c0879 merge upstream/dep updates 2026-01-27 23:24:17 +01:00
jc21
c1ef3a3795 Merge pull request #5238 from NginxProxyManager/dependabot/npm_and_yarn/backend/chalk-5.6.2
Bump chalk from 4.1.2 to 5.6.2 in /backend
2026-01-28 07:45:10 +10:00
jc21
0aad939ccc Merge pull request #5221 from NginxProxyManager/dependabot/npm_and_yarn/test/prod-minor-updates-9ff43a5ae3
Bump @quobix/vacuum from 0.19.4 to 0.23.4 in /test in the prod-minor-updates group
2026-01-28 07:44:41 +10:00
jc21
7e092e265c Merge pull request #5222 from NginxProxyManager/dependabot/npm_and_yarn/backend/prod-minor-updates-61aa9782cd
Bump the prod-minor-updates group in /backend with 4 updates
2026-01-28 07:44:20 +10:00
jc21
cd01a2ee6b Merge pull request #5233 from NginxProxyManager/dependabot/npm_and_yarn/frontend/dev-patch-updates-dcc4fa4550
Bump the dev-patch-updates group across 1 directory with 3 updates
2026-01-28 07:43:59 +10:00
jc21
9e6720561a Merge pull request #5234 from NginxProxyManager/dependabot/npm_and_yarn/test/prod-patch-updates-cda2baf714
Bump the prod-patch-updates group across 1 directory with 5 updates
2026-01-28 07:43:46 +10:00
dependabot[bot]
c50f0a144e Bump the prod-minor-updates group in /backend with 4 updates
Bumps the prod-minor-updates group in /backend with 4 updates: [liquidjs](https://github.com/harttle/liquidjs), [mysql2](https://github.com/sidorares/node-mysql2), [objection](https://github.com/vincit/objection.js) and [pg](https://github.com/brianc/node-postgres/tree/HEAD/packages/pg).


Updates `liquidjs` from 10.6.1 to 10.24.0
- [Release notes](https://github.com/harttle/liquidjs/releases)
- [Changelog](https://github.com/harttle/liquidjs/blob/master/CHANGELOG.md)
- [Commits](https://github.com/harttle/liquidjs/compare/v10.6.1...v10.24.0)

Updates `mysql2` from 3.15.3 to 3.16.1
- [Release notes](https://github.com/sidorares/node-mysql2/releases)
- [Changelog](https://github.com/sidorares/node-mysql2/blob/master/Changelog.md)
- [Commits](https://github.com/sidorares/node-mysql2/compare/v3.15.3...v3.16.1)

Updates `objection` from 3.0.1 to 3.1.5
- [Commits](https://github.com/vincit/objection.js/compare/3.0.1...3.1.5)

Updates `pg` from 8.16.3 to 8.17.2
- [Changelog](https://github.com/brianc/node-postgres/blob/master/CHANGELOG.md)
- [Commits](https://github.com/brianc/node-postgres/commits/pg@8.17.2/packages/pg)

---
updated-dependencies:
- dependency-name: liquidjs
  dependency-version: 10.24.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: prod-minor-updates
- dependency-name: mysql2
  dependency-version: 3.16.1
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: prod-minor-updates
- dependency-name: objection
  dependency-version: 3.1.5
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: prod-minor-updates
- dependency-name: pg
  dependency-version: 8.17.2
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: prod-minor-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-27 13:02:36 +00:00
dependabot[bot]
2a9c1df3cb Bump chalk from 4.1.2 to 5.6.2 in /backend
Bumps [chalk](https://github.com/chalk/chalk) from 4.1.2 to 5.6.2.
- [Release notes](https://github.com/chalk/chalk/releases)
- [Commits](https://github.com/chalk/chalk/compare/v4.1.2...v5.6.2)

---
updated-dependencies:
- dependency-name: chalk
  dependency-version: 5.6.2
  dependency-type: direct:development
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-27 13:02:11 +00:00
dependabot[bot]
ef6391f22e Bump @quobix/vacuum in /test in the prod-minor-updates group
Bumps the prod-minor-updates group in /test with 1 update: [@quobix/vacuum](https://github.com/daveshanley/vacuum).


Updates `@quobix/vacuum` from 0.19.4 to 0.23.4
- [Release notes](https://github.com/daveshanley/vacuum/releases)
- [Commits](https://github.com/daveshanley/vacuum/compare/v0.19.4...v0.23.4)

---
updated-dependencies:
- dependency-name: "@quobix/vacuum"
  dependency-version: 0.23.4
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: prod-minor-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-27 11:42:28 +00:00
dependabot[bot]
0f46337710 Bump the dev-patch-updates group across 1 directory with 3 updates
Bumps the dev-patch-updates group with 3 updates in the /frontend directory: [@biomejs/biome](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome), [@testing-library/react](https://github.com/testing-library/react-testing-library) and [vitest](https://github.com/vitest-dev/vitest/tree/HEAD/packages/vitest).


Updates `@biomejs/biome` from 2.3.2 to 2.3.13
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.3.13/packages/@biomejs/biome)

Updates `@testing-library/react` from 16.3.0 to 16.3.2
- [Release notes](https://github.com/testing-library/react-testing-library/releases)
- [Changelog](https://github.com/testing-library/react-testing-library/blob/main/CHANGELOG.md)
- [Commits](https://github.com/testing-library/react-testing-library/compare/v16.3.0...v16.3.2)

Updates `vitest` from 4.0.6 to 4.0.18
- [Release notes](https://github.com/vitest-dev/vitest/releases)
- [Commits](https://github.com/vitest-dev/vitest/commits/v4.0.18/packages/vitest)

---
updated-dependencies:
- dependency-name: "@biomejs/biome"
  dependency-version: 2.3.13
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: dev-patch-updates
- dependency-name: "@testing-library/react"
  dependency-version: 16.3.2
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: dev-patch-updates
- dependency-name: vitest
  dependency-version: 4.0.18
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: dev-patch-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-27 11:42:25 +00:00
dependabot[bot]
1b84b8ace2 Bump the prod-patch-updates group across 1 directory with 5 updates
Bumps the prod-patch-updates group with 5 updates in the /test directory:

| Package | From | To |
| --- | --- | --- |
| [axios](https://github.com/axios/axios) | `1.13.1` | `1.13.3` |
| [eslint](https://github.com/eslint/eslint) | `9.39.0` | `9.39.2` |
| [eslint-plugin-cypress](https://github.com/cypress-io/eslint-plugin-cypress) | `5.2.0` | `5.2.1` |
| [form-data](https://github.com/form-data/form-data) | `4.0.4` | `4.0.5` |
| [mocha](https://github.com/mochajs/mocha) | `11.7.4` | `11.7.5` |



Updates `axios` from 1.13.1 to 1.13.3
- [Release notes](https://github.com/axios/axios/releases)
- [Changelog](https://github.com/axios/axios/blob/v1.x/CHANGELOG.md)
- [Commits](https://github.com/axios/axios/compare/v1.13.1...v1.13.3)

Updates `eslint` from 9.39.0 to 9.39.2
- [Release notes](https://github.com/eslint/eslint/releases)
- [Commits](https://github.com/eslint/eslint/compare/v9.39.0...v9.39.2)

Updates `eslint-plugin-cypress` from 5.2.0 to 5.2.1
- [Release notes](https://github.com/cypress-io/eslint-plugin-cypress/releases)
- [Commits](https://github.com/cypress-io/eslint-plugin-cypress/compare/v5.2.0...v5.2.1)

Updates `form-data` from 4.0.4 to 4.0.5
- [Release notes](https://github.com/form-data/form-data/releases)
- [Changelog](https://github.com/form-data/form-data/blob/master/CHANGELOG.md)
- [Commits](https://github.com/form-data/form-data/compare/v4.0.4...v4.0.5)

Updates `mocha` from 11.7.4 to 11.7.5
- [Release notes](https://github.com/mochajs/mocha/releases)
- [Changelog](https://github.com/mochajs/mocha/blob/v11.7.5/CHANGELOG.md)
- [Commits](https://github.com/mochajs/mocha/compare/v11.7.4...v11.7.5)

---
updated-dependencies:
- dependency-name: axios
  dependency-version: 1.13.3
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: prod-patch-updates
- dependency-name: eslint
  dependency-version: 9.39.2
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: prod-patch-updates
- dependency-name: eslint-plugin-cypress
  dependency-version: 5.2.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: prod-patch-updates
- dependency-name: form-data
  dependency-version: 4.0.5
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: prod-patch-updates
- dependency-name: mocha
  dependency-version: 11.7.5
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: prod-patch-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-27 11:41:51 +00:00
Zoey
2acf184bd1 Merge remote-tracking branch 'upstream/develop' into develop 2026-01-27 11:56:58 +01:00
renovate[bot]
4ab9077013 dep updates 2026-01-27 11:55:48 +01:00
Jamie Curnow
8ea8286cec More cypress fixes 2026-01-27 14:02:23 +10:00
Jamie Curnow
7ca48f876b Ugh cypress changed their exec result format. 2026-01-27 11:55:54 +10:00
Jamie Curnow
7c3c59c79f Fix cypress logger 2026-01-27 11:41:12 +10:00
Jamie Curnow
ef7f444404 Update docker image to match js version 2026-01-27 11:27:21 +10:00
Jamie Curnow
f509e0bdba Missing export 2026-01-27 11:26:54 +10:00
Jamie Curnow
9b7af474bb Cypress ... 2026-01-27 11:22:16 +10:00
Jamie Curnow
28982b8bc2 Updated config files for cypress 2026-01-27 10:46:30 +10:00
jc21
19e654b998 Merge pull request #5228 from NginxProxyManager/dependabot/npm_and_yarn/frontend/dev-minor-updates-79aa50ef1e
Bump the dev-minor-updates group in /frontend with 6 updates
2026-01-27 08:48:40 +10:00
Jamie Curnow
eaf9f5ab1e Linting/sorting for lang 2026-01-27 08:45:57 +10:00
Jamie Curnow
4af0a968f0 Cypress module conversion and updated chalk 2026-01-27 08:45:23 +10:00
jc21
df06eb6c2f Merge pull request #5204 from NginxProxyManager/dependabot/npm_and_yarn/frontend/lodash-4.17.23
Bump lodash from 4.17.21 to 4.17.23 in /frontend
2026-01-27 08:06:38 +10:00
jc21
74360cc9b3 Merge pull request #5205 from NginxProxyManager/dependabot/npm_and_yarn/test/lodash-4.17.23
Bump lodash from 4.17.21 to 4.17.23 in /test
2026-01-27 08:06:29 +10:00
jc21
16a301fc64 Merge pull request #5227 from NginxProxyManager/dependabot/npm_and_yarn/backend/knex-3.1.0
Bump knex from 2.4.2 to 3.1.0 in /backend
2026-01-27 08:02:18 +10:00
dependabot[bot]
2d774124dc Bump the dev-minor-updates group in /frontend with 6 updates
Bumps the dev-minor-updates group in /frontend with 6 updates:

| Package | From | To |
| --- | --- | --- |
| [@formatjs/cli](https://github.com/formatjs/formatjs) | `6.7.4` | `6.12.0` |
| [@tanstack/react-query-devtools](https://github.com/TanStack/query/tree/HEAD/packages/react-query-devtools) | `5.90.2` | `5.91.2` |
| [happy-dom](https://github.com/capricorn86/happy-dom) | `20.0.10` | `20.3.7` |
| [sass](https://github.com/sass/dart-sass) | `1.93.3` | `1.97.3` |
| [vite](https://github.com/vitejs/vite/tree/HEAD/packages/vite) | `7.1.12` | `7.3.1` |
| [vite-plugin-checker](https://github.com/fi3ework/vite-plugin-checker) | `0.11.0` | `0.12.0` |


Updates `@formatjs/cli` from 6.7.4 to 6.12.0
- [Release notes](https://github.com/formatjs/formatjs/releases)
- [Commits](https://github.com/formatjs/formatjs/compare/@formatjs/cli@6.7.4...@formatjs/cli@6.12.0)

Updates `@tanstack/react-query-devtools` from 5.90.2 to 5.91.2
- [Release notes](https://github.com/TanStack/query/releases)
- [Changelog](https://github.com/TanStack/query/blob/main/packages/react-query-devtools/CHANGELOG.md)
- [Commits](https://github.com/TanStack/query/commits/@tanstack/react-query-devtools@5.91.2/packages/react-query-devtools)

Updates `happy-dom` from 20.0.10 to 20.3.7
- [Release notes](https://github.com/capricorn86/happy-dom/releases)
- [Commits](https://github.com/capricorn86/happy-dom/compare/v20.0.10...v20.3.7)

Updates `sass` from 1.93.3 to 1.97.3
- [Release notes](https://github.com/sass/dart-sass/releases)
- [Changelog](https://github.com/sass/dart-sass/blob/main/CHANGELOG.md)
- [Commits](https://github.com/sass/dart-sass/compare/1.93.3...1.97.3)

Updates `vite` from 7.1.12 to 7.3.1
- [Release notes](https://github.com/vitejs/vite/releases)
- [Changelog](https://github.com/vitejs/vite/blob/v7.3.1/packages/vite/CHANGELOG.md)
- [Commits](https://github.com/vitejs/vite/commits/v7.3.1/packages/vite)

Updates `vite-plugin-checker` from 0.11.0 to 0.12.0
- [Release notes](https://github.com/fi3ework/vite-plugin-checker/releases)
- [Commits](https://github.com/fi3ework/vite-plugin-checker/compare/vite-plugin-checker@0.11.0...vite-plugin-checker@0.12.0)

---
updated-dependencies:
- dependency-name: "@formatjs/cli"
  dependency-version: 6.12.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: dev-minor-updates
- dependency-name: "@tanstack/react-query-devtools"
  dependency-version: 5.91.2
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: dev-minor-updates
- dependency-name: happy-dom
  dependency-version: 20.3.7
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: dev-minor-updates
- dependency-name: sass
  dependency-version: 1.97.3
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: dev-minor-updates
- dependency-name: vite
  dependency-version: 7.3.1
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: dev-minor-updates
- dependency-name: vite-plugin-checker
  dependency-version: 0.12.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: dev-minor-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-26 21:21:07 +00:00
dependabot[bot]
124737bbc6 Bump knex from 2.4.2 to 3.1.0 in /backend
Bumps [knex](https://github.com/knex/knex) from 2.4.2 to 3.1.0.
- [Release notes](https://github.com/knex/knex/releases)
- [Changelog](https://github.com/knex/knex/blob/master/CHANGELOG.md)
- [Commits](https://github.com/knex/knex/compare/2.4.2...3.1.0)

---
updated-dependencies:
- dependency-name: knex
  dependency-version: 3.1.0
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-26 21:19:32 +00:00
jc21
d5d222ef2d Merge pull request #5217 from NginxProxyManager/dependabot/github_actions/actions/stale-10
Bump actions/stale from 9 to 10
2026-01-27 07:19:17 +10:00
jc21
b96e932c64 Merge pull request #5218 from NginxProxyManager/dependabot/npm_and_yarn/backend/dev-patch-updates-166e475698
Bump @biomejs/biome from 2.3.2 to 2.3.12 in /backend in the dev-patch-updates group
2026-01-27 07:18:25 +10:00
jc21
d09cb2884c Merge pull request #5225 from NginxProxyManager/dependabot/npm_and_yarn/backend/nodemon-3.1.11
Bump nodemon from 2.0.22 to 3.1.11 in /backend
2026-01-27 07:18:11 +10:00
jc21
71deabcc67 Merge pull request #5219 from NginxProxyManager/dependabot/npm_and_yarn/backend/prod-patch-updates-1dc931d47a
Bump jsonwebtoken from 9.0.2 to 9.0.3 in /backend in the prod-patch-updates group
2026-01-27 07:17:20 +10:00
jc21
a78039b65f Merge pull request #5226 from NginxProxyManager/dependabot/npm_and_yarn/test/cypress-15.9.0
Bump cypress from 14.5.4 to 15.9.0 in /test
2026-01-27 07:16:48 +10:00
jc21
48acbd33ab Merge pull request #5231 from NginxProxyManager/dependabot/npm_and_yarn/frontend/vite-tsconfig-paths-6.0.5
Bump vite-tsconfig-paths from 5.1.4 to 6.0.5 in /frontend
2026-01-27 07:16:05 +10:00
dependabot[bot]
32cabc0f83 Bump vite-tsconfig-paths from 5.1.4 to 6.0.5 in /frontend
Bumps [vite-tsconfig-paths](https://github.com/aleclarson/vite-tsconfig-paths) from 5.1.4 to 6.0.5.
- [Release notes](https://github.com/aleclarson/vite-tsconfig-paths/releases)
- [Commits](https://github.com/aleclarson/vite-tsconfig-paths/compare/v5.1.4...v6.0.5)

---
updated-dependencies:
- dependency-name: vite-tsconfig-paths
  dependency-version: 6.0.5
  dependency-type: direct:development
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-25 12:33:54 +00:00
dependabot[bot]
03a82cd861 Bump cypress from 14.5.4 to 15.9.0 in /test
Bumps [cypress](https://github.com/cypress-io/cypress) from 14.5.4 to 15.9.0.
- [Release notes](https://github.com/cypress-io/cypress/releases)
- [Changelog](https://github.com/cypress-io/cypress/blob/develop/CHANGELOG.md)
- [Commits](https://github.com/cypress-io/cypress/compare/v14.5.4...v15.9.0)

---
updated-dependencies:
- dependency-name: cypress
  dependency-version: 15.9.0
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-25 12:31:34 +00:00
dependabot[bot]
5f19f7125e Bump nodemon from 2.0.22 to 3.1.11 in /backend
Bumps [nodemon](https://github.com/remy/nodemon) from 2.0.22 to 3.1.11.
- [Release notes](https://github.com/remy/nodemon/releases)
- [Commits](https://github.com/remy/nodemon/compare/v2.0.22...v3.1.11)

---
updated-dependencies:
- dependency-name: nodemon
  dependency-version: 3.1.11
  dependency-type: direct:development
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-25 12:31:28 +00:00
dependabot[bot]
8d35644190 Bump jsonwebtoken in /backend in the prod-patch-updates group
Bumps the prod-patch-updates group in /backend with 1 update: [jsonwebtoken](https://github.com/auth0/node-jsonwebtoken).


Updates `jsonwebtoken` from 9.0.2 to 9.0.3
- [Changelog](https://github.com/auth0/node-jsonwebtoken/blob/master/CHANGELOG.md)
- [Commits](https://github.com/auth0/node-jsonwebtoken/compare/v9.0.2...v9.0.3)

---
updated-dependencies:
- dependency-name: jsonwebtoken
  dependency-version: 9.0.3
  dependency-type: direct:production
  update-type: version-update:semver-patch
  dependency-group: prod-patch-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-25 12:30:49 +00:00
dependabot[bot]
ad2e4c8afe Bump @biomejs/biome in /backend in the dev-patch-updates group
Bumps the dev-patch-updates group in /backend with 1 update: [@biomejs/biome](https://github.com/biomejs/biome/tree/HEAD/packages/@biomejs/biome).


Updates `@biomejs/biome` from 2.3.2 to 2.3.12
- [Release notes](https://github.com/biomejs/biome/releases)
- [Changelog](https://github.com/biomejs/biome/blob/main/packages/@biomejs/biome/CHANGELOG.md)
- [Commits](https://github.com/biomejs/biome/commits/@biomejs/biome@2.3.12/packages/@biomejs/biome)

---
updated-dependencies:
- dependency-name: "@biomejs/biome"
  dependency-version: 2.3.12
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: dev-patch-updates
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-25 12:30:27 +00:00
dependabot[bot]
69f9031447 Bump actions/stale from 9 to 10
Bumps [actions/stale](https://github.com/actions/stale) from 9 to 10.
- [Release notes](https://github.com/actions/stale/releases)
- [Changelog](https://github.com/actions/stale/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/stale/compare/v9...v10)

---
updated-dependencies:
- dependency-name: actions/stale
  dependency-version: '10'
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-25 12:30:04 +00:00
jc21
3308a308df Merge pull request #5185 from Lokowitz/add-dependa-config
Add Dependabot config
2026-01-25 22:29:28 +10:00
jc21
59b0e75324 Merge pull request #5200 from toviszsolt/lang-hungarian
Add Hungarian language support and help documentation
2026-01-25 22:14:50 +10:00
Zoey
5ef312792b Merge remote-tracking branch 'upstream/develop' into develop 2026-01-25 13:13:38 +01:00
dependabot[bot]
727bc944ea Bump lodash from 4.17.21 to 4.17.23 in /frontend
Bumps [lodash](https://github.com/lodash/lodash) from 4.17.21 to 4.17.23.
- [Release notes](https://github.com/lodash/lodash/releases)
- [Commits](https://github.com/lodash/lodash/compare/4.17.21...4.17.23)

---
updated-dependencies:
- dependency-name: lodash
  dependency-version: 4.17.23
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-25 12:10:48 +00:00
dependabot[bot]
a0ef0d9048 Bump lodash from 4.17.21 to 4.17.23 in /test
Bumps [lodash](https://github.com/lodash/lodash) from 4.17.21 to 4.17.23.
- [Release notes](https://github.com/lodash/lodash/releases)
- [Commits](https://github.com/lodash/lodash/compare/4.17.21...4.17.23)

---
updated-dependencies:
- dependency-name: lodash
  dependency-version: 4.17.23
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-25 12:10:44 +00:00
jc21
d2e346c912 Merge pull request #5203 from NginxProxyManager/dependabot/npm_and_yarn/frontend/lodash-es-4.17.23
Bump lodash-es from 4.17.21 to 4.17.23 in /frontend
2026-01-25 22:09:59 +10:00
jc21
32a716b3a9 Merge pull request #5206 from NginxProxyManager/dependabot/npm_and_yarn/backend/lodash-4.17.23
Bump lodash from 4.17.21 to 4.17.23 in /backend
2026-01-25 22:09:32 +10:00
Zoey
d5b8fcaede default host: use temporary redirect 2026-01-25 09:58:01 +01:00
Zoey
923cd457a7 set crowdsec_disable_appsec to 0 by default to fix log spam if disabled 2026-01-25 09:58:01 +01:00
Zoey
39bfd1996d delete cookie if it is invalid 2026-01-25 09:58:01 +01:00
Zoey
73fd056400 switch back to 301 https redirects 2026-01-25 09:58:01 +01:00
Zoey
86d7c57396 dep updates/always create nginx http socket
Signed-off-by: Zoey <zoey@z0ey.de>
2026-01-25 09:58:01 +01:00
Zoey
bc1866a00a fix #2594 2026-01-22 11:06:50 +01:00
Zoey
af46fed52e use patch file instead of sed to patch nginx/remove cosmetic changes 2026-01-22 09:53:14 +01:00
Zoey
9b421ddf69 dep updates + lang: streams only support proxy protocol v1 2026-01-22 09:53:14 +01:00
Zsolt Tovis
ef6918947c fix: update (2) Hungarian translations for consistency and clarity.
- Clarification of the translation of action.add-location
2026-01-22 07:49:07 +01:00
dependabot[bot]
2deb5447d6 Bump lodash from 4.17.21 to 4.17.23 in /backend
Bumps [lodash](https://github.com/lodash/lodash) from 4.17.21 to 4.17.23.
- [Release notes](https://github.com/lodash/lodash/releases)
- [Commits](https://github.com/lodash/lodash/compare/4.17.21...4.17.23)

---
updated-dependencies:
- dependency-name: lodash
  dependency-version: 4.17.23
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-21 23:43:45 +00:00
dependabot[bot]
1bb29259ea Bump lodash-es from 4.17.21 to 4.17.23 in /frontend
Bumps [lodash-es](https://github.com/lodash/lodash) from 4.17.21 to 4.17.23.
- [Release notes](https://github.com/lodash/lodash/releases)
- [Commits](https://github.com/lodash/lodash/compare/4.17.21...4.17.23)

---
updated-dependencies:
- dependency-name: lodash-es
  dependency-version: 4.17.23
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-21 23:07:00 +00:00
Zoey
7a067c6892 http-only => no-tls, since streams don't use http 2026-01-20 23:27:18 +01:00
renovate[bot]
f292ef71a6 dep updates/also lint PRs
Signed-off-by: Zoey <zoey@z0ey.de>
2026-01-20 22:08:22 +01:00
Zoey
9d82b5b33e auto redirect to oidc if password is disabled 2026-01-20 22:08:22 +01:00
Zoey
da4b052b6d run certbot every six hours 2026-01-20 22:08:22 +01:00
Zoey
bdaddfd797 SSL => TLS in backend logger 2026-01-20 22:08:22 +01:00
Zoey
82af671e25 add buffering and noindex button/enable unzstd 2026-01-20 22:08:21 +01:00
Zoey
7cd9b83611 Streams: add TLS to upstream button 2026-01-20 22:08:21 +01:00
Zoey
db4b1f1cab always return 401 if there is no token sent 2026-01-20 22:08:21 +01:00
Zoey
135b2c7162 try to fix zstd with disabled proxy buffering and add unzstd module
Signed-off-by: Zoey <zoey@z0ey.de>
2026-01-20 22:08:21 +01:00
Zoey
6ab1793f80 fix loginAsUser 2026-01-20 22:08:21 +01:00
Zsolt Tovis
fa20c7d8a4 fix: update Hungarian translations for consistency and clarity.
- Fine-tuning of some Hungarian language-specific expressions.
2026-01-20 18:44:40 +01:00
Zsolt Tövis
4ed17fef01 Update frontend/src/locale/src/hu.json
Typo-fix: GitHub

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-01-20 16:56:46 +01:00
Zsolt Tovis
fe316252f1 Add Hungarian language support and help documentation
- Integrated Hungarian translations into the IntlProvider and lang-list.
- Added Hungarian help documentation for various topics including Access Lists, Certificates, Proxy Hosts, and more.
- Updated locale options to include Hungarian language.
2026-01-20 16:38:06 +01:00
Zoey
ebc52fb236 only reduce temporary file log buffering to notice from warn 2026-01-18 14:36:31 +01:00
Zoey
fb5b37b2a4 replace certbot-dns-powerdns plugin #2563 2026-01-18 14:19:20 +01:00
Zoey
b9dbe5ffcb merge upstream 2026-01-18 13:55:43 +01:00
Zoey
eb40b95cf9 Merge remote-tracking branch 'upstream/develop' into develop 2026-01-18 13:26:30 +01:00
jc21
7747db994d Merge pull request #5087 from xJayMorex/update-cloudns
Fixed #4715 by updating certbot-dns-cloudns
2026-01-18 20:00:15 +10:00
jc21
9ffced265b Merge pull request #5038 from orhnplt/feature/turkish-locale
Add Turkish locale and help documentation
2026-01-18 19:59:29 +10:00
Lokowitz
50cf275328 split directories 2026-01-18 07:00:46 +00:00
Lokowitz
7bcc34dea9 add dependabot config 2026-01-18 06:52:30 +00:00
Zoey
b07b0ee8a3 reduce temporary file log buffering to info 2026-01-17 23:53:12 +01:00
Zoey
6053d73a3b readd njs
Signed-off-by: Zoey <zoey@z0ey.de>
2026-01-17 22:42:43 +01:00
Zoey
65a5b73396 do not log an error if /etc/letsencrypt is mounted and there are no files to move
Signed-off-by: Zoey <zoey@z0ey.de>
2026-01-17 22:29:56 +01:00
Zoey
e210b44039 fix copy/paste mistake in docs and dep updates
Signed-off-by: Zoey <zoey@z0ey.de>
2026-01-17 22:29:56 +01:00
Zoey
d0ea12347a update template version
Signed-off-by: Zoey <zoey@z0ey.de>
2026-01-17 12:52:26 +01:00
GitHub
eff216f59e update and lint
Signed-off-by: GitHub <noreply@github.com>
2026-01-17 11:48:23 +00:00
Zoey
9dcadbc230 Refactor bulkGenerateConfigs to use async/await
Signed-off-by: Zoey <zoey@z0ey.de>
2026-01-17 12:47:49 +01:00
Zoey
946a9a2448 reduce temporary file buffering error level 2026-01-17 10:52:16 +01:00
Zoey
9192ef5da2 fix totp (#2531)/dep updates 2026-01-17 10:50:25 +01:00
Zoey
8ab731433f update template version to also force host regeneration when updating from the last update 2026-01-16 21:26:35 +01:00
Zoey
4f209833e7 add new migration to reset button values which now do something else 2026-01-16 20:46:32 +01:00
renovate[bot]
b3e754c0d7 also disable wud/dep updates 2026-01-16 20:46:32 +01:00
Orhan Polat
131e5fea4f fix: remove duplicate locales in lang-list 2026-01-16 12:15:13 +03:00
Orhan Polat
4e412f18bb fix: resolve lint issues in IntlProvider and HelpDoc 2026-01-16 11:59:34 +03:00
Orhan Polat
bb0a50eccb chore: trigger CI 2026-01-16 11:45:34 +03:00
Orhan Polat
4185665570 Add Turkish locale and help documentation 2026-01-16 11:44:18 +03:00
Zoey
864e947d8d update template version 2026-01-15 23:47:13 +01:00
Zoey
f34b1dc535 move html files 2026-01-15 23:39:43 +01:00
renovate[bot]
f07a002244 run certbot every 12 hours by default/dep updates 2026-01-15 22:12:19 +01:00
Zoey
cb10fd155f merge upstream
Signed-off-by: Zoey <zoey@z0ey.de>
2026-01-15 16:47:55 +01:00
Zoey
a77895eb1c Merge remote-tracking branch 'upstream/develop' into develop 2026-01-15 13:51:08 +01:00
renovate[bot]
3aa3a06d2d improve docs/dep updates
Signed-off-by: Zoey <zoey@z0ey.de>
2026-01-15 12:55:59 +01:00
jc21
9ea6fee3ce Merge pull request #4930 from blinkerfluessigkeit/lang-de
Update German translations
2026-01-15 09:58:07 +10:00
jc21
7ee9a3c9f0 Merge pull request #4952 from GedasMirak/develop
Add French translation
2026-01-15 09:56:51 +10:00
Zoey
3a6ee422cb only send initials to gravatar 2026-01-14 22:39:07 +01:00
Zoey
9c8cb7396e improve german translation based on upstream PRs 4946 and 4930 2026-01-14 22:39:07 +01:00
renovate[bot]
93bd44f70b dep updates/docs updates
Signed-off-by: Zoey <zoey@z0ey.de>
2026-01-14 22:39:07 +01:00
Zoey
21ef78b9b2 merge upstream/dep updates 2026-01-14 17:57:26 +01:00
blinkerfluessigkeit
afb196e5b6 Update German translations 2026-01-14 12:41:41 +01:00
Zoey
022237a7b7 Merge remote-tracking branch 'upstream/develop' into develop 2026-01-14 10:37:53 +01:00
GedasMirak
0b464ac9fd Add French locale 2026-01-14 15:01:33 +10:00
jc21
f3efaae320 Merge pull request #5141 from NginxProxyManager/develop
v2.13.6
2026-01-14 14:30:49 +10:00
jc21
7b3c1fd061 Merge branch 'master' into develop 2026-01-14 13:47:51 +10:00
Jamie Curnow
ee42202348 Bump version 2026-01-14 13:34:17 +10:00
Jamie Curnow
c1ad7788f1 Changed 2fa delete from body to query for code
as per best practices
2026-01-14 13:24:38 +10:00
Jamie Curnow
d33bb02c74 Add missing params to swagger 2026-01-14 12:46:30 +10:00
Jamie Curnow
462c134751 2fa work slight refactor
- use existing access mechanisms for validation
- adds swagger/schema and validation of incoming payload
2026-01-14 11:45:12 +10:00
Zoey
3ca8e35614 mention totp improvements in the readme
Signed-off-by: Zoey <zoey@z0ey.de>
2026-01-14 00:33:36 +01:00
Zoey
1c45a26ccc migrate to otplib v13 and render qr code locally 2026-01-14 00:09:41 +01:00
Zoey
30bae8cb4c hotfix: NPMplus uses bcryptjs not bcrypt 2026-01-13 23:03:43 +01:00
Zoey
46b82cdb06 SSL=>TLS, Let's Encrypt=>Certbot, NPM => NPMplus in HelpDoc 2026-01-13 22:59:48 +01:00
GitHub
7af2253aed update and lint
Signed-off-by: GitHub <noreply@github.com>
2026-01-13 21:42:43 +00:00
Zoey
8c1defb39a Merge remote-tracking branch 'upstream/develop' into develop 2026-01-13 22:40:40 +01:00
Zoey
6dd24cb68b adjust new lang files 2026-01-13 22:38:35 +01:00
jc21
b7dfaddbb1 Merge pull request #4970 from zdzichu6969/develop
Polish Translation Fixes
2026-01-14 07:33:49 +10:00
jc21
11ee4f0820 Merge pull request #4965 from archettitechnology/develop
Update Italian locale message for empty objects
2026-01-14 07:32:07 +10:00
Zoey
f76a7c6dff Merge remote-tracking branch 'upstream/develop' into develop 2026-01-13 22:27:00 +01:00
jc21
19970a4220 Merge pull request #5095 from aindriu80/develop
feat: (i18n) Added Irish translation
2026-01-14 07:26:10 +10:00
Zoey
ec67b04c2f keep brotli enabled when open-appsec's attachment module is loaded, as it now supports brotli
Signed-off-by: Zoey <zoey@z0ey.de>
2026-01-13 22:20:21 +01:00
Zoey
d7702102d3 dep updates/fix typo
Signed-off-by: Zoey <zoey@z0ey.de>
2026-01-13 21:37:42 +01:00
Zoey
f481792c92 merge upstream/fix 24h format 2026-01-13 21:37:42 +01:00
Zoey
a9b2fd358b Merge remote-tracking branch 'upstream/develop' into develop 2026-01-13 14:43:20 +01:00
Zoey
70da615d44 merge upstream 2026-01-13 14:42:34 +01:00
Zoey
53f7dad829 Merge remote-tracking branch 'upstream/develop' into develop 2026-01-13 14:42:26 +01:00
jc21
59bac3b468 Merge pull request #5005 from NginxProxyManager/dependabot/npm_and_yarn/backend/express-4.22.0
Bump express from 4.21.2 to 4.22.0 in /backend
2026-01-13 23:35:27 +10:00
jc21
48753fb101 Merge pull request #5136 from NginxProxyManager/dependabot/npm_and_yarn/docs/mdast-util-to-hast-13.2.1
Bump mdast-util-to-hast from 13.2.0 to 13.2.1 in /docs
2026-01-13 23:35:13 +10:00
Zoey
aaec311699 merge upstream 2026-01-13 14:30:23 +01:00
dependabot[bot]
2a3978ae3f Bump mdast-util-to-hast from 13.2.0 to 13.2.1 in /docs
Bumps [mdast-util-to-hast](https://github.com/syntax-tree/mdast-util-to-hast) from 13.2.0 to 13.2.1.
- [Release notes](https://github.com/syntax-tree/mdast-util-to-hast/releases)
- [Commits](https://github.com/syntax-tree/mdast-util-to-hast/compare/13.2.0...13.2.1)

---
updated-dependencies:
- dependency-name: mdast-util-to-hast
  dependency-version: 13.2.1
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-13 13:28:52 +00:00
dependabot[bot]
4ce5da5930 Bump express from 4.21.2 to 4.22.0 in /backend
Bumps [express](https://github.com/expressjs/express) from 4.21.2 to 4.22.0.
- [Release notes](https://github.com/expressjs/express/releases)
- [Changelog](https://github.com/expressjs/express/blob/4.22.0/History.md)
- [Commits](https://github.com/expressjs/express/compare/4.21.2...4.22.0)

---
updated-dependencies:
- dependency-name: express
  dependency-version: 4.22.0
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-01-13 13:26:06 +00:00
jc21
89d3756ee6 Merge pull request #5118 from mobilandi/develop
Add DNS plugin for All-Inkl provider
2026-01-13 23:19:00 +10:00
Zoey
25204dff5a Merge remote-tracking branch 'upstream/develop' into develop 2026-01-13 14:14:57 +01:00
Jamie Curnow
58c63096e4 Skip color output for vitest in ci 2026-01-13 22:55:19 +10:00
Jamie Curnow
b01a22c393 Fix frontend locale tests after date-fns changed intl formatting
and also attempt to format dates in locale
2026-01-13 22:42:42 +10:00
Zoey
9f8b1755fa Add GNU AGPL v3 license file
Signed-off-by: Zoey <zoey@z0ey.de>
2026-01-13 13:24:12 +01:00
Jamie Curnow
9c25410331 Fix locale sort not to use sponge 2026-01-13 22:15:54 +10:00
Strana-Mechty
db72c8b372 Add compression for less.js
Add compression for the less.js MIME type; this stylesheet library is used by a few self-hosted applications like Cryptpad and (like regular CSS) is compressible.

Signed-off-by: Strana-Mechty <124194364+Strana-Mechty@users.noreply.github.com>
2026-01-13 12:34:26 +01:00
renovate[bot]
505bc73f52 dep updates/ignore lint push errors
Signed-off-by: Zoey <zoey@z0ey.de>
2026-01-13 12:34:26 +01:00
jc21
b3a901bbc5 Merge pull request #5015 from NginxProxyManager/dependabot/npm_and_yarn/backend/jws-3.2.3
Bump jws from 3.2.2 to 3.2.3 in /backend
2026-01-13 15:18:41 +10:00
jc21
3e3396ba9a Update lang-list.json 2026-01-13 15:05:13 +10:00
jc21
3eb493bb8b Merge pull request #5022 from dupsatou/add-dns-plugin-support-he-ddns
Add Hurricane Electric DDNS plugin configuration
2026-01-13 14:53:51 +10:00
jc21
8c8221a352 Merge pull request #5037 from vtj-mizuno/fix-japanese-translate
Fix Japanese translation
2026-01-13 14:53:07 +10:00
jc21
582681e3ff Merge pull request #5080 from bzuro/develop
Change visibility to permission_visibility in report.js
2026-01-13 14:52:45 +10:00
jc21
52fae6d35f Merge pull request #5084 from lacamera/security/CVE-2025-55182
security: bump react to 19.2.3 to fix CVE-2025-55182 (#5020)
2026-01-13 14:50:39 +10:00
jc21
6c0ea835ce Merge branch 'develop' into develop 2026-01-13 14:46:35 +10:00
jc21
fb52655374 Merge pull request #5103 from CamelT0E/develop
Update German locale message from 'German' to 'Deutsch'
2026-01-13 14:43:42 +10:00
Jamie Curnow
336726db8d Backend yarn lock updates 2026-01-13 14:40:10 +10:00
jc21
4a7853163e Merge pull request #5107 from teguh02/develop
feat(i18n): add Bahasa Indonesia translations and help documentation
2026-01-13 14:32:18 +10:00
jc21
b30f8e47e2 Merge pull request #5109 from piotrfx/develop
Add TOTP-based two-factor authentication
2026-01-13 14:30:48 +10:00
jc21
6fa30840be Merge pull request #5114 from Shotz5/develop
Added logging for streams based on port
2026-01-13 14:18:13 +10:00
jc21
05726aaab9 Merge pull request #5119 from manisto/develop
Added support for DNS challenges with Simply.com
2026-01-13 14:14:38 +10:00
jc21
f85bb79f13 Merge pull request #5121 from KalebCheng/feature/certificate-key-type-selection
Add option to select RSA or ECDSA key type when creating certificates
2026-01-13 14:13:22 +10:00
kk.cheng
471b62c7fe Add option to select RSA or ECDSA key type when creating certificates 2026-01-07 19:13:12 +08:00
Zoey
c5db4f4f7c fix sorting of certbot dns providers 2026-01-05 21:44:07 +01:00
Gert Rue Brigsted
55a1e0a4e7 Added support for DNS challenges with Simply.com 2026-01-04 21:50:47 +01:00
mobilandi
f25afa3590 Change version constraint for certbot-dns-kas 2026-01-03 23:08:34 +01:00
mobilandi
9211ba6d1a Add DNS plugin for All-Inkl provider 2026-01-03 23:06:25 +01:00
Alex Kitsul
aeb44244a7 Added logging for streams based on port 2025-12-30 21:44:29 -08:00
piotrfx
d2d204ab8e Trigger CI 2025-12-28 12:04:35 +01:00
piotrfx
427afa55b4 Add TOTP-based two-factor authentication
- Add 2FA setup, enable, disable, and backup code management
- Integrate 2FA challenge flow into login process
- Add frontend modal for 2FA configuration
- Support backup codes for account recovery
2025-12-28 11:58:30 +01:00
Teguh Rijanandi
bbe98a639a Add Indonesian locale and help docs 2025-12-27 22:35:17 +07:00
Aindriú Mac Giolla Eoin
f0c0b465d9 Removing 0x200b - Zero width space 2025-12-20 17:53:05 +00:00
Aindriú Mac Giolla Eoin
6c2f6a9d39 Fixing plural/iolra issue 2025-12-19 11:43:18 +00:00
Aindriú Mac Giolla Eoin
2f6e3ad804 Added Irish translation 2025-12-18 18:21:14 +00:00
John Taylor
c9f453714b Fixed #4715 by updating certbot-dns-cloudns 2025-12-15 17:03:29 +01:00
Francesco La Camera
5e6ead1eee security: bump react to 19.2.3 to fix CVE-2025-55182 (#5020) 2025-12-15 09:54:18 +01:00
bzuro
da519e72ba Change visibility to permission_visibility in report.js
Fix for issue #2014, where even an administrator with all_items visibility got 0 proxy hosts in the dashboard.
2025-12-14 00:35:22 +01:00
Hajime MIZUNO
b13ebb2247 Fix Japanese translation 2025-12-10 23:28:53 +09:00
dupsatou
6b322582b9 Add Hurricane Electric DDNS plugin configuration
Add support for DNS verification using Hurricane Electric DDNS credentials as a more secure alternative to account root credentials. More information available here: https://github.com/mafredri/certbot-dns-he-ddns
2025-12-08 09:45:11 -06:00
angioletto
7fe5070337 Merge branch 'NginxProxyManager:develop' into develop 2025-12-06 14:56:52 +01:00
CamelT0E
1b8f1fbb79 Update German locale message from 'German' to 'Deutsch' 2025-12-06 01:30:56 +01:00
dependabot[bot]
4abea1247d Bump jws from 3.2.2 to 3.2.3 in /backend
Bumps [jws](https://github.com/brianloveswords/node-jws) from 3.2.2 to 3.2.3.
- [Release notes](https://github.com/brianloveswords/node-jws/releases)
- [Changelog](https://github.com/auth0/node-jws/blob/master/CHANGELOG.md)
- [Commits](https://github.com/brianloveswords/node-jws/compare/v3.2.2...v3.2.3)

---
updated-dependencies:
- dependency-name: jws
  dependency-version: 3.2.3
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-12-04 16:58:07 +00:00
Mateusz Gruszczyński
073ee95e56 change 2025-12-02 12:57:09 +01:00
Mateusz Gruszczyński
168078eb40 changes 2025-11-26 10:54:30 +01:00
Mateusz Gruszczyński
2c9f8f4d64 changes 2025-11-26 10:50:41 +01:00
Mateusz Gruszczyński
8403a0c761 changes 2025-11-26 10:42:48 +01:00
angioletto
927e57257b Merge branch 'NginxProxyManager:develop' into develop 2025-11-21 17:03:47 +01:00
Mateusz Gruszczyński
56875bba52 pretty :) 2025-11-19 21:23:23 +01:00
Mateusz Gruszczyński
b55f51bd63 fixes1 in pl 2025-11-19 15:10:56 +01:00
Mateusz Gruszczyński
86b7394620 fixes1 2025-11-19 11:01:25 +01:00
Mateusz Gruszczyński
91a1f39c02 fixes1 2025-11-19 10:53:55 +01:00
angioletto
5c114e9db7 Update Italian locale message for empty objects
Wrong translation of line 431
2025-11-19 09:56:05 +01:00
Mateusz Gruszczyński
fec9bffe29 fixes1 2025-11-19 09:13:55 +01:00
jc21
847c58b170 Merge pull request #4956 from NginxProxyManager/develop
v2.13.5
2025-11-18 21:13:24 +10:00
331 changed files with 15111 additions and 2891 deletions


@@ -13,7 +13,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout
-uses: actions/checkout@v6
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: Read version
id: version
run: echo "version=$(cat caddy/Dockerfile | grep "^COPY --from=caddy:.*$" | head -1 | sed "s|COPY --from=caddy:\([0-9.]\+\).*|\1|g")" >> $GITHUB_OUTPUT
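The `Read version` step above can be exercised outside CI; a minimal sketch, assuming a made-up sample `COPY --from=caddy:` line in a temp file (the real workflow reads `caddy/Dockerfile`):

```shell
# Simulate caddy/Dockerfile with one assumed COPY line (version is hypothetical).
tmpfile="$(mktemp)"
printf 'COPY --from=caddy:2.10.0 /usr/bin/caddy /usr/local/bin/caddy\n' > "$tmpfile"
# Same grep/sed pipeline as the workflow step: keep the first matching line
# and strip everything but the leading version number.
version="$(grep "^COPY --from=caddy:.*$" "$tmpfile" \
  | head -1 \
  | sed "s|COPY --from=caddy:\([0-9.]\+\).*|\1|g")"
rm -f "$tmpfile"
echo "$version"
```

The `[0-9.]\+` class stops at the first non-digit/non-dot character, so suffixes like `-alpine` or the copy destination never reach the captured group.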


@@ -13,28 +13,28 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout
-uses: actions/checkout@v6
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: Set up QEMU
-uses: docker/setup-qemu-action@v3
+uses: docker/setup-qemu-action@c7c53464625b32c7a7e944ae62b3e17d2b600130 # v3
with:
platforms: all
- name: Set up Docker Buildx
-uses: docker/setup-buildx-action@v3
+uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3
with:
driver-opts: env.BUILDKIT_STEP_LOG_MAX_SIZE=-1
- name: Login to DockerHub
-uses: docker/login-action@v3
+uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Login to GitHub Container Registry
-uses: docker/login-action@v3
+uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3
with:
registry: ghcr.io
username: zoeyvid
password: ${{ github.token }}
- name: Build
-uses: docker/build-push-action@v6
+uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6
with:
context: caddy
platforms: linux/amd64,linux/arm64


@@ -11,7 +11,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout
-uses: actions/checkout@v6
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: update nginx version
id: update
run: |
@@ -26,7 +26,7 @@ jobs:
sed -i "s|ARG NGINX_VER=.*|ARG NGINX_VER=$NGINX_VER|" Dockerfile
echo "version=$NGINX_VER" >> $GITHUB_OUTPUT
- name: Create Pull Request
-uses: peter-evans/create-pull-request@v8
+uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v8
with:
signoff: true
delete-branch: true
@@ -39,24 +39,22 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout
-uses: actions/checkout@v6
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: update dynamic_tls_records version
id: update
run: |
-git clone https://github.com/nginx-modules/ngx_http_tls_dyn_size ngx_http_tls_dyn_size
+git clone --depth 1 https://github.com/nginx-modules/ngx_http_tls_dyn_size ngx_http_tls_dyn_size
DTR_VER="$(
ls ngx_http_tls_dyn_size/nginx__dynamic_tls_records_*.patch \
| sed "s|ngx_http_tls_dyn_size/nginx__dynamic_tls_records_\([0-9.]\+\)+.patch|\1|g" \
| sort -V \
| grep -v rc \
-| tail -1 \
-| sed "s|\^{}||g"
+| tail -1
)"
rm -r ngx_http_tls_dyn_size
sed -i "s|ARG DTR_VER=.*|ARG DTR_VER=$DTR_VER|" Dockerfile
echo "version=$DTR_VER" >> $GITHUB_OUTPUT
- name: Create Pull Request
-uses: peter-evans/create-pull-request@v8
+uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v8
with:
signoff: true
delete-branch: true
@@ -68,22 +66,21 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout
-uses: actions/checkout@v6
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: update resolver_conf_parsing version
id: update
run: |
-git clone https://github.com/openresty/openresty openresty
+git clone --depth 1 https://github.com/openresty/openresty openresty
RCP_VER="$(
ls openresty/patches/nginx \
| sort -V \
-| tail -1 \
-| sed "s|\^{}||g"
+| tail -1
)"
rm -r openresty
sed -i "s|ARG RCP_VER=.*|ARG RCP_VER=$RCP_VER|" Dockerfile
echo "version=$RCP_VER" >> $GITHUB_OUTPUT
- name: Create Pull Request
-uses: peter-evans/create-pull-request@v8
+uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v8
with:
signoff: true
delete-branch: true
@@ -91,12 +88,39 @@ jobs:
branch: update-resolver_conf_parsing-version
title: update resolver_conf_parsing version to ${{ steps.update.outputs.version }}
body: update resolver_conf_parsing version to ${{ steps.update.outputs.version }}
+zlib-ng-patch-update:
+runs-on: ubuntu-latest
+steps:
+- name: Checkout
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
+- name: update zlib-ng-patch version
+id: update
+run: |
+git clone --depth 1 https://github.com/zlib-ng/patches zlib-ng-patches
+ZNP_VER="$(
+ls zlib-ng-patches/nginx/*-zlib-ng.patch \
+| sed "s|zlib-ng-patches/nginx/\([0-9.]\+\)-zlib-ng.patch|\1|g" \
+| sort -V \
+| tail -1
+)"
+rm -r zlib-ng-patches
+sed -i "s|ARG ZNP_VER=.*|ARG ZNP_VER=$ZNP_VER|" Dockerfile
+echo "version=$ZNP_VER" >> $GITHUB_OUTPUT
+- name: Create Pull Request
+uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v8
+with:
+signoff: true
+delete-branch: true
+commit-message: update zlib-ng-patch version to ${{ steps.update.outputs.version }}
+branch: update-zlib-ng-patch-version
+title: update zlib-ng-patch version to ${{ steps.update.outputs.version }}
+body: update zlib-ng-patch version to ${{ steps.update.outputs.version }}
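The new zlib-ng-patch job derives the version from patch filenames; the filename-to-version step can be sketched with assumed filenames (no clone needed, versions are made up):

```shell
# Assumed patch filenames, as `ls` would print them one per line.
files='zlib-ng-patches/nginx/1.27.0-zlib-ng.patch
zlib-ng-patches/nginx/1.29.3-zlib-ng.patch
zlib-ng-patches/nginx/1.28.1-zlib-ng.patch'
# Same sed/sort pipeline as the job: strip path and suffix, then pick the
# highest version with a version-aware sort.
ZNP_VER="$(printf '%s\n' "$files" \
  | sed "s|zlib-ng-patches/nginx/\([0-9.]\+\)-zlib-ng.patch|\1|g" \
  | sort -V \
  | tail -1)"
echo "$ZNP_VER"
```

`sort -V` is what makes this correct for versions: a plain lexical sort would rank 1.9.x above 1.28.x.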
ngx_brotli-update:
runs-on: ubuntu-latest
steps:
- name: Checkout
-uses: actions/checkout@v6
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: update ngx_brotli version
id: update
run: |
@@ -111,7 +135,7 @@ jobs:
sed -i "s|ARG NB_VER=.*|ARG NB_VER=$NB_VER|" Dockerfile
echo "version=$NB_VER" >> $GITHUB_OUTPUT
- name: Create Pull Request
-uses: peter-evans/create-pull-request@v8
+uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v8
if: ${{ steps.update.outputs.version != '' }}
with:
signoff: true
@@ -124,12 +148,12 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout
-uses: actions/checkout@v6
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: update ngx_unbrotli version
id: update
run: |
NUB_VER="$(
git ls-remote --tags https://github.com/clyfish/ngx_unbrotli \
git ls-remote --tags https://github.com/clyfish/ngx_unbrotli \
| cut -d/ -f3 \
| sort -V \
| grep -v rc \
@@ -139,7 +163,7 @@ jobs:
sed -i "s|ARG NUB_VER=.*|ARG NUB_VER=$NUB_VER|" Dockerfile
echo "version=$NUB_VER" >> $GITHUB_OUTPUT
- name: Create Pull Request
-uses: peter-evans/create-pull-request@v8
+uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v8
if: ${{ steps.update.outputs.version != '' }}
with:
signoff: true
@@ -152,7 +176,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout
-uses: actions/checkout@v6
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: update zstd-nginx-module version
id: update
run: |
@@ -167,7 +191,7 @@ jobs:
sed -i "s|ARG ZNM_VER=.*|ARG ZNM_VER=$ZNM_VER|" Dockerfile
echo "version=$ZNM_VER" >> $GITHUB_OUTPUT
- name: Create Pull Request
-uses: peter-evans/create-pull-request@v8
+uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v8
if: ${{ steps.update.outputs.version != '0.1.1' }}
with:
signoff: true
@@ -176,11 +200,39 @@ jobs:
branch: update-zstd-nginx-module-version
title: update zstd-nginx-module version to ${{ steps.update.outputs.version }}
body: update zstd-nginx-module version to ${{ steps.update.outputs.version }}
+ngx_http_unzstd_filter_module-update:
+runs-on: ubuntu-latest
+steps:
+- name: Checkout
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
+- name: update ngx_http_unzstd_filter_module version
+id: update
+run: |
+NHUZFM_VER="$(
+git ls-remote --tags https://github.com/HanadaLee/ngx_http_unzstd_filter_module \
+| cut -d/ -f3 \
+| sort -V \
+| grep -v rc \
+| tail -1 \
+| sed "s|\^{}||g"
+)"
+sed -i "s|ARG NHUZFM_VER=.*|ARG NHUZFM_VER=$NHUZFM_VER|" Dockerfile
+echo "version=$NHUZFM_VER" >> $GITHUB_OUTPUT
+- name: Create Pull Request
+uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v8
+if: ${{ steps.update.outputs.version != '' }}
+with:
+signoff: true
+delete-branch: true
+commit-message: update ngx_http_unzstd_filter_module version to ${{ steps.update.outputs.version }}
+branch: update-ngx_http_unzstd_filter_module-version
+title: update ngx_http_unzstd_filter_module version to ${{ steps.update.outputs.version }}
+body: update ngx_http_unzstd_filter_module version to ${{ steps.update.outputs.version }}
ngx-fancyindex-update:
runs-on: ubuntu-latest
steps:
- name: Checkout
-uses: actions/checkout@v6
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: update ngx-fancyindex version
id: update
run: |
@@ -195,7 +247,7 @@ jobs:
sed -i "s|ARG NF_VER=.*|ARG NF_VER=$NF_VER|" Dockerfile
echo "version=$NF_VER" >> $GITHUB_OUTPUT
- name: Create Pull Request
-uses: peter-evans/create-pull-request@v8
+uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v8
if: ${{ steps.update.outputs.version != 'v0.5.2' }}
with:
signoff: true
@@ -208,7 +260,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout
-uses: actions/checkout@v6
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: update headers-more-nginx-module version
id: update
run: |
@@ -223,7 +275,7 @@ jobs:
sed -i "s|ARG HMNM_VER=.*|ARG HMNM_VER=$HMNM_VER|" Dockerfile
echo "version=$HMNM_VER" >> $GITHUB_OUTPUT
- name: Create Pull Request
-uses: peter-evans/create-pull-request@v8
+uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v8
with:
signoff: true
delete-branch: true
@@ -235,7 +287,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout
-uses: actions/checkout@v6
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: update ngx_devel_kit version
id: update
run: |
@@ -250,7 +302,7 @@ jobs:
sed -i "s|ARG NDK_VER=.*|ARG NDK_VER=$NDK_VER|" Dockerfile
echo "version=$NDK_VER" >> $GITHUB_OUTPUT
- name: Create Pull Request
-uses: peter-evans/create-pull-request@v8
+uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v8
with:
signoff: true
delete-branch: true
@@ -262,7 +314,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout
-uses: actions/checkout@v6
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: update lua-nginx-module version
id: update
run: |
@@ -277,7 +329,7 @@ jobs:
sed -i "s|ARG LNM_VER=.*|ARG LNM_VER=$LNM_VER|" Dockerfile
echo "version=$LNM_VER" >> $GITHUB_OUTPUT
- name: Create Pull Request
-uses: peter-evans/create-pull-request@v8
+uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v8
with:
signoff: true
delete-branch: true
@@ -286,11 +338,39 @@ jobs:
title: update lua-nginx-module version to ${{ steps.update.outputs.version }}
body: update lua-nginx-module version to ${{ steps.update.outputs.version }}
+njs-update:
+runs-on: ubuntu-latest
+steps:
+- name: Checkout
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
+- name: update njs version
+id: update
+run: |
+NJS_VER="$(
+git ls-remote --tags https://github.com/nginx/njs \
+| cut -d/ -f3 \
+| sort -V \
+| grep -v rc \
+| tail -1 \
+| sed "s|\^{}||g"
+)"
+sed -i "s|ARG NJS_VER=.*|ARG NJS_VER=$NJS_VER|" Dockerfile
+echo "version=$NJS_VER" >> $GITHUB_OUTPUT
+- name: Create Pull Request
+uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v8
+with:
+signoff: true
+delete-branch: true
+commit-message: update njs version to ${{ steps.update.outputs.version }}
+branch: update-njs-version
+title: update njs version to ${{ steps.update.outputs.version }}
+body: update njs version to ${{ steps.update.outputs.version }}
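Several of these update jobs share the same `git ls-remote --tags` selection pipeline. A sketch fed with assumed `ls-remote` output instead of a live network call (hashes and versions are made up):

```shell
# Assumed `git ls-remote --tags` output. Annotated tags appear twice: the
# dereferenced copy carries a ^{} suffix.
tags='abc123 refs/tags/0.8.9
def456 refs/tags/0.9.0
aaa111 refs/tags/0.9.1-rc1
bbb222 refs/tags/0.9.1
ccc333 refs/tags/0.9.1^{}'
# Same pipeline as the jobs: take the tag name, version-sort, drop release
# candidates, keep the newest, and strip the ^{} dereference marker.
NJS_VER="$(printf '%s\n' "$tags" \
  | cut -d/ -f3 \
  | sort -V \
  | grep -v rc \
  | tail -1 \
  | sed "s|\^{}||g")"
echo "$NJS_VER"
```

Note `grep -v rc` filters any line containing `rc`, which is fine for these tag schemes but would also drop a hypothetical tag like `v1.0-arc`.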
nginx-auth-ldap-update:
runs-on: ubuntu-latest
steps:
- name: Checkout
-uses: actions/checkout@v6
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: update nginx-auth-ldap version
id: update
run: |
@@ -305,7 +385,7 @@ jobs:
sed -i "s|ARG NAL_VER=.*|ARG NAL_VER=$NAL_VER|" Dockerfile
echo "version=$NAL_VER" >> $GITHUB_OUTPUT
- name: Create Pull Request
-uses: peter-evans/create-pull-request@v8
+uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v8
if: ${{ steps.update.outputs.version != 'v0.1' }}
with:
signoff: true
@@ -318,7 +398,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout
-uses: actions/checkout@v6
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: update vts version
id: update
run: |
@@ -333,7 +413,7 @@ jobs:
sed -i "s|ARG VTS_VER=.*|ARG VTS_VER=$VTS_VER|" Dockerfile
echo "version=$VTS_VER" >> $GITHUB_OUTPUT
- name: Create Pull Request
-uses: peter-evans/create-pull-request@v8
+uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v8
with:
signoff: true
delete-branch: true
@@ -345,7 +425,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout
-uses: actions/checkout@v6
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: update nginx-ntlm-module version
id: update
run: |
@@ -360,7 +440,7 @@ jobs:
sed -i "s|ARG NNTLM_VER=.*|ARG NNTLM_VER=$NNTLM_VER|" Dockerfile
echo "version=$NNTLM_VER" >> $GITHUB_OUTPUT
- name: Create Pull Request
-uses: peter-evans/create-pull-request@v8
+uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v8
if: ${{ steps.update.outputs.version != 'v1.19.3-beta.1' }}
with:
signoff: true
@@ -373,7 +453,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout
-uses: actions/checkout@v6
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: update ngx_http_geoip2_module version
id: update
run: |
@@ -388,7 +468,7 @@ jobs:
sed -i "s|ARG NHG2M_VER=.*|ARG NHG2M_VER=$NHG2M_VER|" Dockerfile
echo "version=$NHG2M_VER" >> $GITHUB_OUTPUT
- name: Create Pull Request
-uses: peter-evans/create-pull-request@v8
+uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v8
with:
signoff: true
delete-branch: true
@@ -401,7 +481,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout
-uses: actions/checkout@v6
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: update lua-resty-core version
id: update
run: |
@@ -416,7 +496,7 @@ jobs:
sed -i "s|ARG LRC_VER=.*|ARG LRC_VER=$LRC_VER|" Dockerfile
echo "version=$LRC_VER" >> $GITHUB_OUTPUT
- name: Create Pull Request
-uses: peter-evans/create-pull-request@v8
+uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v8
with:
signoff: true
delete-branch: true
@@ -428,7 +508,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout
-uses: actions/checkout@v6
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: update lua-resty-lrucache version
id: update
run: |
@@ -443,7 +523,7 @@ jobs:
sed -i "s|ARG LRL_VER=.*|ARG LRL_VER=$LRL_VER|" Dockerfile
echo "version=$LRL_VER" >> $GITHUB_OUTPUT
- name: Create Pull Request
-uses: peter-evans/create-pull-request@v8
+uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v8
with:
signoff: true
delete-branch: true
@@ -456,7 +536,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout
-uses: actions/checkout@v6
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: update lua-cs-bouncer version
id: update
run: |
@@ -473,7 +553,7 @@ jobs:
wget https://raw.githubusercontent.com/crowdsecurity/cs-nginx-bouncer/refs/heads/main/nginx/crowdsec_nginx.conf -O rootfs/usr/local/nginx/conf/conf.d/crowdsec.conf.original
wget https://raw.githubusercontent.com/crowdsecurity/lua-cs-bouncer/refs/tags/"$LCSB_VER"/config_example.conf -O rootfs/etc/crowdsec.conf.original
- name: Create Pull Request
-uses: peter-evans/create-pull-request@v8
+uses: peter-evans/create-pull-request@c0f553fe549906ede9cf27b5156039d195d2ece0 # v8
with:
signoff: true
delete-branch: true


@@ -12,13 +12,13 @@ jobs:
if: ${{ github.repository_owner == 'ZoeyVid' }}
steps:
- name: Checkout
-uses: actions/checkout@v6
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: Set up Docker Buildx
-uses: docker/setup-buildx-action@v3
+uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3
with:
driver-opts: env.BUILDKIT_STEP_LOG_MAX_SIZE=-1
- name: Login to GitHub Container Registry
-uses: docker/login-action@v3
+uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3
with:
registry: ghcr.io
username: zoeyvid
@@ -28,7 +28,7 @@ jobs:
sed -i "s|\"0.0.0\"|\"${{ inputs.tag }}-$(git rev-parse --short HEAD)-$(cat .version)\"|g" frontend/package.json
sed -i "s|\"0.0.0\"|\"${{ inputs.tag }}-$(git rev-parse --short HEAD)-$(cat .version)\"|g" backend/package.json
- name: Build
-uses: docker/build-push-action@v6
+uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6
with:
context: .
file: ./Dockerfile
@@ -42,13 +42,13 @@ jobs:
if: ${{ github.repository_owner == 'ZoeyVid' }}
steps:
- name: Checkout
-uses: actions/checkout@v6
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: Set up Docker Buildx
-uses: docker/setup-buildx-action@v3
+uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3
with:
driver-opts: env.BUILDKIT_STEP_LOG_MAX_SIZE=-1
- name: Login to GitHub Container Registry
-uses: docker/login-action@v3
+uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3
with:
registry: ghcr.io
username: zoeyvid
@@ -58,7 +58,7 @@ jobs:
sed -i "s|\"0.0.0\"|\"${{ inputs.tag }}-$(git rev-parse --short HEAD)-$(cat .version)\"|g" frontend/package.json
sed -i "s|\"0.0.0\"|\"${{ inputs.tag }}-$(git rev-parse --short HEAD)-$(cat .version)\"|g" backend/package.json
- name: Build
-uses: docker/build-push-action@v6
+uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6
with:
context: .
file: ./Dockerfile
@@ -73,12 +73,12 @@ jobs:
if: ${{ github.repository_owner == 'ZoeyVid' }}
steps:
- name: Login to DockerHub
-uses: docker/login-action@v3
+uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Login to GitHub Container Registry
-uses: docker/login-action@v3
+uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3
with:
registry: ghcr.io
username: zoeyvid


@@ -12,13 +12,13 @@ jobs:
if: ${{ github.repository_owner == 'ZoeyVid' }}
steps:
- name: Checkout
-uses: actions/checkout@v6
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: Set up Docker Buildx
-uses: docker/setup-buildx-action@v3
+uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3
with:
driver-opts: env.BUILDKIT_STEP_LOG_MAX_SIZE=-1
- name: Login to GitHub Container Registry
-uses: docker/login-action@v3
+uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3
with:
registry: ghcr.io
username: zoeyvid
@@ -28,7 +28,7 @@ jobs:
sed -i "s|\"0.0.0\"|\"${{ inputs.tag }}-$(git rev-parse --short HEAD)-$(cat .version)\"|g" frontend/package.json
sed -i "s|\"0.0.0\"|\"${{ inputs.tag }}-$(git rev-parse --short HEAD)-$(cat .version)\"|g" backend/package.json
- name: Build
-uses: docker/build-push-action@v6
+uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6
with:
context: .
file: ./Dockerfile
@@ -42,13 +42,13 @@ jobs:
if: ${{ github.repository_owner == 'ZoeyVid' }}
steps:
- name: Checkout
-uses: actions/checkout@v6
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: Set up Docker Buildx
-uses: docker/setup-buildx-action@v3
+uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3
with:
driver-opts: env.BUILDKIT_STEP_LOG_MAX_SIZE=-1
- name: Login to GitHub Container Registry
-uses: docker/login-action@v3
+uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3
with:
registry: ghcr.io
username: zoeyvid
@@ -58,7 +58,7 @@ jobs:
sed -i "s|\"0.0.0\"|\"${{ inputs.tag }}-$(git rev-parse --short HEAD)-$(cat .version)\"|g" frontend/package.json
sed -i "s|\"0.0.0\"|\"${{ inputs.tag }}-$(git rev-parse --short HEAD)-$(cat .version)\"|g" backend/package.json
- name: Build
-uses: docker/build-push-action@v6
+uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6
with:
context: .
file: ./Dockerfile
@@ -73,12 +73,12 @@ jobs:
if: ${{ github.repository_owner == 'ZoeyVid' }}
steps:
- name: Login to DockerHub
-uses: docker/login-action@v3
+uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Login to GitHub Container Registry
-uses: docker/login-action@v3
+uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3
with:
registry: ghcr.io
username: zoeyvid


@@ -3,7 +3,6 @@ on:
push:
branches:
- develop
-pull_request:
workflow_dispatch:
jobs:
build-x86_64:
@@ -11,13 +10,13 @@ jobs:
if: ${{ github.repository_owner == 'ZoeyVid' }}
steps:
- name: Checkout
-uses: actions/checkout@v6
+uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: Set up Docker Buildx
-uses: docker/setup-buildx-action@v3
+uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3
with:
driver-opts: env.BUILDKIT_STEP_LOG_MAX_SIZE=-1
- name: Login to GitHub Container Registry
-uses: docker/login-action@v3
+uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3
with:
registry: ghcr.io
username: zoeyvid
@@ -27,8 +26,7 @@ jobs:
sed -i "s|\"0.0.0\"|\"$(git rev-parse --short HEAD)\"|g" frontend/package.json
sed -i "s|\"0.0.0\"|\"$(git rev-parse --short HEAD)\"|g" backend/package.json
- name: Build
-uses: docker/build-push-action@v6
-if: ${{ github.event_name != 'pull_request' }}
+uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6
with:
context: .
file: ./Dockerfile
@@ -36,33 +34,19 @@ jobs:
tags: ghcr.io/zoeyvid/npmplus:develop-x86_64
build-args: |
FLAGS=-march=x86-64-v2 -mtune=generic -fcf-protection=full
-- name: Set PR-Number (PR)
-if: ${{ github.event_name == 'pull_request' }}
-id: pr
-run: echo "pr=$(echo pr-develop | sed "s|refs/pull/:||g" | sed "s|/merge||g")" >> $GITHUB_OUTPUT
-- name: Build (PR)
-uses: docker/build-push-action@v6
-if: ${{ github.event_name == 'pull_request' }}
-with:
-context: .
-file: ./Dockerfile
-push: true
-tags: ghcr.io/zoeyvid/npmplus:${{ steps.pr.outputs.pr }}-x86_64
-build-args: |
-FLAGS=-march=x86-64-v2 -mtune=generic -fcf-protection=full
build-aarch64:
runs-on: ubuntu-24.04-arm
if: ${{ github.repository_owner == 'ZoeyVid' }}
steps:
- name: Checkout
uses: actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3
with:
driver-opts: env.BUILDKIT_STEP_LOG_MAX_SIZE=-1
- name: Login to GitHub Container Registry
uses: docker/login-action@v3
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3
with:
registry: ghcr.io
username: zoeyvid
@@ -72,8 +56,7 @@ jobs:
sed -i "s|\"0.0.0\"|\"$(git rev-parse --short HEAD)\"|g" frontend/package.json
sed -i "s|\"0.0.0\"|\"$(git rev-parse --short HEAD)\"|g" backend/package.json
- name: Build
uses: docker/build-push-action@v6
if: ${{ github.event_name != 'pull_request' }}
uses: docker/build-push-action@10e90e3645eae34f1e60eeb005ba3a3d33f178e8 # v6
with:
context: .
file: ./Dockerfile
@@ -81,20 +64,6 @@ jobs:
tags: ghcr.io/zoeyvid/npmplus:develop-aarch64
build-args: |
FLAGS=-mbranch-protection=standard
- name: Set PR-Number (PR)
if: ${{ github.event_name == 'pull_request' }}
id: pr
run: echo "pr=$(echo pr-develop | sed "s|refs/pull/:||g" | sed "s|/merge||g")" >> $GITHUB_OUTPUT
- name: Build (PR)
uses: docker/build-push-action@v6
if: ${{ github.event_name == 'pull_request' }}
with:
context: .
file: ./Dockerfile
push: true
tags: ghcr.io/zoeyvid/npmplus:${{ steps.pr.outputs.pr }}-aarch64
build-args: |
FLAGS=-mbranch-protection=standard
merge:
runs-on: ubuntu-latest
@@ -102,33 +71,17 @@ jobs:
if: ${{ github.repository_owner == 'ZoeyVid' }}
steps:
- name: Login to DockerHub
if: ${{ github.event_name != 'pull_request' }}
uses: docker/login-action@v3
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Login to GitHub Container Registry
uses: docker/login-action@v3
uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3
with:
registry: ghcr.io
username: zoeyvid
password: ${{ github.token }}
- name: create multiarch
if: ${{ github.event_name != 'pull_request' }}
run: |
docker buildx imagetools create --tag zoeyvid/npmplus:develop ghcr.io/zoeyvid/npmplus:develop-x86_64 ghcr.io/zoeyvid/npmplus:develop-aarch64
docker buildx imagetools create --tag ghcr.io/zoeyvid/npmplus:develop ghcr.io/zoeyvid/npmplus:develop-x86_64 ghcr.io/zoeyvid/npmplus:develop-aarch64
- name: Set PR-Number (PR)
if: ${{ github.event_name == 'pull_request' }}
id: pr
run: echo "pr=$(echo pr-develop | sed "s|refs/pull/:||g" | sed "s|/merge||g")" >> $GITHUB_OUTPUT
- name: create multiarch (PR)
if: ${{ github.event_name == 'pull_request' }}
run: docker buildx imagetools create --tag ghcr.io/zoeyvid/npmplus:${{ steps.pr.outputs.pr }} ghcr.io/zoeyvid/npmplus:${{ steps.pr.outputs.pr }}-x86_64 ghcr.io/zoeyvid/npmplus:${{ steps.pr.outputs.pr }}-aarch64
- name: add comment (PR)
uses: mshick/add-pr-comment@v2
if: ${{ github.event_name == 'pull_request' }}
with:
message: "The Docker Image can now be found here: `ghcr.io/zoeyvid/npmplus:${{ steps.pr.outputs.pr }}`"
repo-token: ${{ github.token }}
refresh-message-position: true
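The recurring change in this workflow diff swaps mutable tags (`@v3`, `@v6`) for immutable commit SHAs with the tag kept as a trailing comment. A minimal audit sketch for spotting remaining tag-pinned steps (file paths here are illustrative, not from the repo):

```shell
# Flag `uses:` lines pinned only by a mutable tag (e.g. @v6) rather than
# a 40-character commit SHA, matching the hardening applied in this branch.
mkdir -p /tmp/wf-audit
cat > /tmp/wf-audit/build.yml <<'EOF'
      - uses: actions/checkout@v6
      - uses: docker/login-action@c94ce9fb468520275223c153574b00df6fe4bcc9 # v3
EOF
# Tag-pinned lines match `@v<digits>` right after the `@`; SHA-pinned lines do not.
grep -En 'uses: [^@]+@v[0-9]+' /tmp/wf-audit/build.yml
```

Pinning to a SHA ensures the action's code cannot change underneath the workflow even if the upstream tag is moved or force-pushed.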


@@ -11,7 +11,7 @@ jobs:
name: docker-lint
steps:
- name: Checkout
uses: actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: Install hadolint
run: |
sudo wget https://github.com/hadolint/hadolint/releases/latest/download/hadolint-Linux-x86_64 -O /usr/bin/hadolint


@@ -9,8 +9,8 @@ jobs:
test-json:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v6
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: json-syntax-check
uses: limitusus/json-syntax-check@v2
uses: limitusus/json-syntax-check@77d5756026b93886eaa3dc6ca1c4b17dd19dc703 # v2
with:
pattern: "\\.json"
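The step above runs a repo-wide JSON syntax check. A local equivalent sketch using Python's stdlib parser instead of the pinned action (the directory and file names are made up for illustration):

```shell
# Validate every .json file under a directory with python3's stdlib parser,
# roughly what the json-syntax-check workflow step does in CI.
mkdir -p /tmp/json-check
printf '{"ok": true}' > /tmp/json-check/good.json
printf '{"ok": '      > /tmp/json-check/bad.json   # deliberately truncated
for f in /tmp/json-check/*.json; do
  if python3 -m json.tool "$f" > /dev/null 2>&1; then
    echo "valid:   $f"
  else
    echo "invalid: $f"   # bad.json is reported here
  fi
done
```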


@@ -3,17 +3,18 @@ on:
push:
branches:
- develop
pull_request:
workflow_dispatch:
jobs:
lint-and-format:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v6
- uses: actions/setup-node@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # v6
with:
node-version: lts/*
- uses: pnpm/action-setup@v4
- uses: pnpm/action-setup@41ff72655975bd51cab0327fa583b6e92b6d3061 # v4
with:
version: latest
- name: install-sponge
@@ -42,5 +43,5 @@ jobs:
git add -A
git config user.name "GitHub"
git config user.email "noreply@github.com"
git diff-index --quiet HEAD || git commit -sm "update and lint"
git push
git commit -sm "update and lint" || true
git push || true
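This hunk replaces the `git diff-index --quiet HEAD || git commit` guard with `git commit ... || true`, which keeps the CI step green when the tree is clean. A small sketch of why the `|| true` matters (the repo path is a throwaway temp directory, not from this project):

```shell
# `git commit` exits non-zero when there is nothing to commit; under `set -e`
# that would abort the job. `|| true` swallows the failure for a no-op run.
set -e
repo=$(mktemp -d)
git -C "$repo" init -q
git -C "$repo" -c user.name=ci -c user.email=ci@example.com \
  commit -q --allow-empty -m init
# Nothing staged: without `|| true` this line would kill the script.
git -C "$repo" -c user.name=ci -c user.email=ci@example.com \
  commit -q -m "update and lint" || true
echo "step survived a no-op commit"
```

The trade-off is that `|| true` also hides genuine commit or push failures, which the original `git diff-index --quiet HEAD ||` guard did not.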


@@ -10,7 +10,7 @@ jobs:
name: Check Shell
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v6
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: Run Shellcheck
uses: ludeeus/action-shellcheck@master
with:


@@ -11,9 +11,9 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Check out code.
uses: actions/checkout@v6
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6
- name: Check spelling
uses: codespell-project/actions-codespell@v2
uses: codespell-project/actions-codespell@406322ec52dd7b488e48c1c4b82e2a8b3a1bf630 # v2
with:
check_filenames: true
check_hidden: true


@@ -1,6 +0,0 @@
{
"schedule": "daily",
"aggressiveCompression": "true",
"compressWiki": "true",
"minKBReduced": 0
}


@@ -1 +1 @@
2.13.5
2.14.0

COPYING (new file, 661 lines)

@@ -0,0 +1,661 @@
GNU AFFERO GENERAL PUBLIC LICENSE
Version 3, 19 November 2007
Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
Preamble
The GNU Affero General Public License is a free, copyleft license for
software and other kinds of works, specifically designed to ensure
cooperation with the community in the case of network server software.
The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
our General Public Licenses are intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users.
When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.
Developers that use our General Public Licenses protect your rights
with two steps: (1) assert copyright on the software, and (2) offer
you this License which gives you legal permission to copy, distribute
and/or modify the software.
A secondary benefit of defending all users' freedom is that
improvements made in alternate versions of the program, if they
receive widespread use, become available for other developers to
incorporate. Many developers of free software are heartened and
encouraged by the resulting cooperation. However, in the case of
software used on network servers, this result may fail to come about.
The GNU General Public License permits making a modified version and
letting the public access it on a server without ever releasing its
source code to the public.
The GNU Affero General Public License is designed specifically to
ensure that, in such cases, the modified source code becomes available
to the community. It requires the operator of a network server to
provide the source code of the modified version running there to the
users of that server. Therefore, public use of a modified version, on
a publicly accessible server, gives the public access to the source
code of the modified version.
An older license, called the Affero General Public License and
published by Affero, was designed to accomplish similar goals. This is
a different license, not a version of the Affero GPL, but Affero has
released a new version of the Affero GPL which permits relicensing under
this license.
The precise terms and conditions for copying, distribution and
modification follow.
TERMS AND CONDITIONS
0. Definitions.
"This License" refers to version 3 of the GNU Affero General Public License.
"Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.
"The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.
To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.
A "covered work" means either the unmodified Program or a work based
on the Program.
To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.
To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.
An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.
1. Source Code.
The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.
A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.
The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.
The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.
The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.
The Corresponding Source for a work in source code form is that
same work.
2. Basic Permissions.
All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.
You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.
Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.
3. Protecting Users' Legal Rights From Anti-Circumvention Law.
No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.
When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.
4. Conveying Verbatim Copies.
You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.
You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.
5. Conveying Modified Source Versions.
You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:
a) The work must carry prominent notices stating that you modified
it, and giving a relevant date.
b) The work must carry prominent notices stating that it is
released under this License and any conditions added under section
7. This requirement modifies the requirement in section 4 to
"keep intact all notices".
c) You must license the entire work, as a whole, under this
License to anyone who comes into possession of a copy. This
License will therefore apply, along with any applicable section 7
additional terms, to the whole of the work, and all its parts,
regardless of how they are packaged. This License gives no
permission to license the work in any other way, but it does not
invalidate such permission if you have separately received it.
d) If the work has interactive user interfaces, each must display
Appropriate Legal Notices; however, if the Program has interactive
interfaces that do not display Appropriate Legal Notices, your
work need not make them do so.
A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.
6. Conveying Non-Source Forms.
You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:
a) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by the
Corresponding Source fixed on a durable physical medium
customarily used for software interchange.
b) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by a
written offer, valid for at least three years and valid for as
long as you offer spare parts or customer support for that product
model, to give anyone who possesses the object code either (1) a
copy of the Corresponding Source for all the software in the
product that is covered by this License, on a durable physical
medium customarily used for software interchange, for a price no
more than your reasonable cost of physically performing this
conveying of source, or (2) access to copy the
Corresponding Source from a network server at no charge.
c) Convey individual copies of the object code with a copy of the
written offer to provide the Corresponding Source. This
alternative is allowed only occasionally and noncommercially, and
only if you received the object code with such an offer, in accord
with subsection 6b.
d) Convey the object code by offering access from a designated
place (gratis or for a charge), and offer equivalent access to the
Corresponding Source in the same way through the same place at no
further charge. You need not require recipients to copy the
Corresponding Source along with the object code. If the place to
copy the object code is a network server, the Corresponding Source
may be on a different server (operated by you or a third party)
that supports equivalent copying facilities, provided you maintain
clear directions next to the object code saying where to find the
Corresponding Source. Regardless of what server hosts the
Corresponding Source, you remain obligated to ensure that it is
available for as long as needed to satisfy these requirements.
e) Convey the object code using peer-to-peer transmission, provided
you inform other peers where the object code and Corresponding
Source of the work are being offered to the general public at no
charge under subsection 6d.
A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.
A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.
"Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.
If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).
The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.
Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.
7. Additional Terms.
"Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.
When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.
Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:
a) Disclaiming warranty or limiting liability differently from the
terms of sections 15 and 16 of this License; or
b) Requiring preservation of specified reasonable legal notices or
author attributions in that material or in the Appropriate Legal
Notices displayed by works containing it; or
c) Prohibiting misrepresentation of the origin of that material, or
requiring that modified versions of such material be marked in
reasonable ways as different from the original version; or
d) Limiting the use for publicity purposes of names of licensors or
authors of the material; or
e) Declining to grant rights under trademark law for use of some
trade names, trademarks, or service marks; or
f) Requiring indemnification of licensors and authors of that
material by anyone who conveys the material (or modified versions of
it) with contractual assumptions of liability to the recipient, for
any liability that these contractual assumptions directly impose on
those licensors and authors.
All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.
If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.
Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.
8. Termination.
You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).
However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.
Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.
Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.
9. Acceptance Not Required for Having Copies.
You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.
10. Automatic Licensing of Downstream Recipients.
Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.
An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.
You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.
11. Patents.
A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".
A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.
Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.
In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.
If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.
If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.
A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.
Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.
12. No Surrender of Others' Freedom.
If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.
13. Remote Network Interaction; Use with the GNU General Public License.
Notwithstanding any other provision of this License, if you modify the
Program, your modified version must prominently offer all users
interacting with it remotely through a computer network (if your version
supports such interaction) an opportunity to receive the Corresponding
Source of your version by providing access to the Corresponding Source
from a network server at no charge, through some standard or customary
means of facilitating copying of software. This Corresponding Source
shall include the Corresponding Source for any work covered by version 3
of the GNU General Public License that is incorporated pursuant to the
following paragraph.
Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the work with which it is combined will remain governed by version
3 of the GNU General Public License.
14. Revised Versions of this License.
The Free Software Foundation may publish revised and/or new versions of
the GNU Affero General Public License from time to time. Such new versions
will be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.
Each version is given a distinguishing version number. If the
Program specifies that a certain numbered version of the GNU Affero General
Public License "or any later version" applies to it, you have the
option of following the terms and conditions either of that numbered
version or of any later version published by the Free Software
Foundation. If the Program does not specify a version number of the
GNU Affero General Public License, you may choose any version ever published
by the Free Software Foundation.
If the Program specifies that a proxy can decide which future
versions of the GNU Affero General Public License can be used, that proxy's
public statement of acceptance of a version permanently authorizes you
to choose that version for the Program.
Later license versions may give you additional or different
permissions. However, no additional obligations are imposed on any
author or copyright holder as a result of your choosing to follow a
later version.
15. Disclaimer of Warranty.
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
16. Limitation of Liability.
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
SUCH DAMAGES.
17. Interpretation of Sections 15 and 16.
If the disclaimer of warranty and limitation of liability provided
above cannot be given local legal effect according to their terms,
reviewing courts shall apply local law that most closely approximates
an absolute waiver of all civil liability in connection with the
Program, unless a warranty or assumption of liability accompanies a
copy of the Program in return for a fee.
END OF TERMS AND CONDITIONS
How to Apply These Terms to Your New Programs
If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.
To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
state the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.
<one line to give the program's name and a brief idea of what it does.>
Copyright (C) <year> <name of author>
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU Affero General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU Affero General Public License for more details.
You should have received a copy of the GNU Affero General Public License
along with this program. If not, see <https://www.gnu.org/licenses/>.
Also add information on how to contact you by electronic and paper mail.
If your software can interact with users remotely through a computer
network, you should also make sure that it provides a way for users to
get its source. For example, if your program is a web application, its
interface could display a "Source" link that leads users to an archive
of the code. There are many ways you could offer source, and different
solutions will be better for different programs; see section 13 for the
specific requirements.
You should also get your employer (if you work as a programmer) or school,
if any, to sign a "copyright disclaimer" for the program, if necessary.
For more information on this, and how to apply and follow the GNU AGPL, see
<https://www.gnu.org/licenses/>.
@@ -1,22 +1,25 @@
# syntax=docker/dockerfile:labs
FROM alpine:3.23.2 AS nginx
FROM alpine:3.23.3 AS nginx
SHELL ["/bin/ash", "-eo", "pipefail", "-c"]
ARG LUAJIT_INC=/usr/include/luajit-2.1
ARG LUAJIT_LIB=/usr/lib
ARG NGINX_VER=release-1.29.4
ARG NGINX_VER=release-1.29.5
ARG DTR_VER=1.29.2
ARG RCP_VER=1.29.4
ARG ZNP_VER=1.26.3
ARG NB_VER=master
ARG NUB_VER=main
ARG ZNM_VER=master
ARG NF_VER=master
ARG NHUZFM_VER=main
ARG NF_VER=v0.6.0
ARG HMNM_VER=v0.39
ARG NDK_VER=v0.3.4
ARG LNM_VER=v0.10.29R2
ARG NJS_VER=0.9.5
ARG NAL_VER=master
ARG VTS_VER=v0.2.5
ARG NNTLM_VER=master
@@ -24,19 +27,17 @@ ARG NHG2M_VER=3.4
ARG FLAGS
ARG CC=clang
ARG CFLAGS="$FLAGS -m64 -O3 -pipe -flto=thin -fstack-clash-protection -fstack-protector-strong -ftrivial-auto-var-init=zero -fno-delete-null-pointer-checks -fno-strict-overflow -fno-strict-aliasing -fno-plt -U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=3 -Wformat=2 -Werror=format-security -Wno-sign-compare"
ARG CFLAGS="$FLAGS -m64 -O3 -pipe -flto=full -fstack-clash-protection -fstack-protector-strong -ftrivial-auto-var-init=zero -fno-delete-null-pointer-checks -fno-strict-overflow -fno-strict-aliasing -fno-plt -U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=3 -Wformat=2 -Werror=format-security -Wno-sign-compare"
ARG CXX=clang++
ARG CXXFLAGS="$FLAGS -m64 -O3 -pipe -flto=thin -fstack-clash-protection -fstack-protector-strong -ftrivial-auto-var-init=zero -fno-delete-null-pointer-checks -fno-strict-overflow -fno-strict-aliasing -fno-plt -U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=3 -D_GLIBCXX_ASSERTIONS -D_LIBCPP_ENABLE_THREAD_SAFETY_ANNOTATIONS=1 -D_LIBCPP_HARDENING_MODE=_LIBCPP_HARDENING_MODE_FAST -Wformat=2 -Werror=format-security -Wno-sign-compare"
ARG LDFLAGS="-fuse-ld=lld -m64 -Wl,-s -Wl,-O1 -Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now -Wl,--sort-common -Wl,--as-needed -Wl,-z,pack-relative-relocs"
ARG CXXFLAGS="$FLAGS -m64 -O3 -pipe -flto=full -fstack-clash-protection -fstack-protector-strong -ftrivial-auto-var-init=zero -fno-delete-null-pointer-checks -fno-strict-overflow -fno-strict-aliasing -fno-plt -U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=3 -D_GLIBCXX_ASSERTIONS -D_LIBCPP_ENABLE_THREAD_SAFETY_ANNOTATIONS=1 -D_LIBCPP_HARDENING_MODE=_LIBCPP_HARDENING_MODE_FAST -Wformat=2 -Werror=format-security -Wno-sign-compare"
ARG LDFLAGS="-fuse-ld=lld -m64 -Wl,-s -Wl,-O2 -Wl,-z,noexecstack -Wl,-z,relro -Wl,-z,now -Wl,--sort-common -Wl,--as-needed -Wl,-z,pack-relative-relocs"
COPY nginx/ngx_brotli.patch /src/ngx_brotli.patch
COPY nginx/ngx_unbrotli.patch /src/ngx_unbrotli.patch
COPY nginx/zstd-nginx-module.patch /src/zstd-nginx-module.patch
COPY nginx/attachment.patch /src/attachment.patch
WORKDIR /src
COPY patches/*.patch /src
RUN apk upgrade --no-cache -a && \
apk add --no-cache ca-certificates build-base clang lld cmake ninja git \
linux-headers libatomic_ops-dev luajit-dev pcre2-dev zlib-dev brotli-dev zstd-dev openssl-dev geoip-dev libmaxminddb-dev openldap-dev
apk add --no-cache git make clang lld cmake ninja file \
linux-headers libatomic_ops-dev aws-lc aws-lc-dev pcre2-dev luajit-dev zlib-ng-dev brotli-dev zstd-dev libxslt-dev openldap-dev quickjs-ng-dev libmaxminddb-dev clang-dev
RUN git clone --depth 1 https://github.com/nginx/nginx --branch "$NGINX_VER" /src/nginx && \
cd /src/nginx && \
@@ -46,9 +47,9 @@ RUN git clone --depth 1 https://github.com/nginx/nginx --branch "$NGINX_VER" /sr
git apply /src/nginx/2.patch && \
wget -q https://patch-diff.githubusercontent.com/raw/nginx/nginx/pull/689.patch -O /src/nginx/3.patch && \
git apply /src/nginx/3.patch && \
sed -i "s|nginx/|NPMplus/|g" /src/nginx/src/core/nginx.h && \
sed -i "s|Server: nginx|Server: NPMplus|g" /src/nginx/src/http/ngx_http_header_filter_module.c && \
sed -i "/<hr><center>/d" /src/nginx/src/http/ngx_http_special_response.c && \
wget -q https://raw.githubusercontent.com/zlib-ng/patches/refs/heads/master/nginx/"$ZNP_VER"-zlib-ng.patch -O /src/nginx/4.patch && \
git apply /src/nginx/4.patch && \
git apply /src/nginx.patch && \
\
git clone --depth 1 https://github.com/google/ngx_brotli --branch "$NB_VER" /src/ngx_brotli && \
cd /src/ngx_brotli && \
@@ -59,13 +60,19 @@ RUN git clone --depth 1 https://github.com/nginx/nginx --branch "$NGINX_VER" /sr
git clone --depth 1 https://github.com/tokers/zstd-nginx-module --branch "$ZNM_VER" /src/zstd-nginx-module && \
cd /src/zstd-nginx-module && \
wget -q https://patch-diff.githubusercontent.com/raw/tokers/zstd-nginx-module/pull/44.patch -O /src/zstd-nginx-module/1.patch && \
wget -q https://patch-diff.githubusercontent.com/raw/tokers/zstd-nginx-module/pull/23.patch -O /src/zstd-nginx-module/2.patch && \
git apply /src/zstd-nginx-module.patch && \
git apply /src/zstd-nginx-module/1.patch && \
git clone --depth 1 https://github.com/Zoey2936/ngx-fancyindex --branch "$NF_VER" /src/ngx-fancyindex && \
git apply /src/zstd-nginx-module/2.patch && \
git clone --depth 1 https://github.com/HanadaLee/ngx_http_unzstd_filter_module --branch "$NHUZFM_VER" /src/ngx_http_unzstd_filter_module && \
git clone --depth 1 https://github.com/aperezdc/ngx-fancyindex --branch "$NF_VER" /src/ngx-fancyindex && \
git clone --depth 1 https://github.com/openresty/headers-more-nginx-module --branch "$HMNM_VER" /src/headers-more-nginx-module && \
git clone --depth 1 https://github.com/vision5/ngx_devel_kit --branch "$NDK_VER" /src/ngx_devel_kit && \
git clone --depth 1 https://github.com/openresty/lua-nginx-module --branch "$LNM_VER" /src/lua-nginx-module && \
cd /src/lua-nginx-module && \
git apply /src/lua-nginx-module.patch && \
\
git clone --depth 1 https://github.com/nginx/njs --branch "$NJS_VER" /src/njs && \
git clone --depth 1 https://github.com/kvspb/nginx-auth-ldap --branch "$NAL_VER" /src/nginx-auth-ldap && \
git clone --depth 1 https://github.com/vozlt/nginx-module-vts --branch "$VTS_VER" /src/nginx-module-vts && \
git clone --depth 1 https://github.com/gabihodoroaga/nginx-ntlm-module --branch "$NNTLM_VER" /src/nginx-ntlm-module && \
@@ -73,7 +80,7 @@ RUN git clone --depth 1 https://github.com/nginx/nginx --branch "$NGINX_VER" /sr
RUN cd /src/nginx && \
/src/nginx/auto/configure \
--build=nginx \
--build=NPMplus \
--with-debug \
--with-compat \
--with-threads \
@@ -100,12 +107,12 @@ RUN cd /src/nginx && \
--add-module=/src/ngx_brotli \
--add-module=/src/ngx_unbrotli \
--add-module=/src/zstd-nginx-module \
--add-module=/src/ngx_http_unzstd_filter_module \
--add-module=/src/ngx-fancyindex \
--add-module=/src/headers-more-nginx-module \
--add-module=/src/ngx_devel_kit \
--add-module=/src/lua-nginx-module \
--with-http_geoip_module=dynamic \
--with-stream_geoip_module=dynamic \
--add-dynamic-module=/src/njs/nginx \
--add-dynamic-module=/src/nginx-auth-ldap \
--add-dynamic-module=/src/nginx-module-vts \
--add-dynamic-module=/src/nginx-ntlm-module \
@@ -135,7 +142,7 @@ RUN find /usr/local/nginx/modules -name "*.so" -exec strip -s {} \; && \
/usr/local/nginx/sbin/nginx -V
FROM --platform="$BUILDPLATFORM" alpine:3.23.2 AS frontend
FROM --platform="$BUILDPLATFORM" alpine:3.23.3 AS frontend
SHELL ["/bin/ash", "-eo", "pipefail", "-c"]
ARG NODE_ENV=production
COPY frontend /app
@@ -147,7 +154,7 @@ RUN apk upgrade --no-cache -a && \
pnpm tsc && \
pnpm vite build
FROM alpine:3.23.2 AS backend
FROM alpine:3.23.3 AS backend
SHELL ["/bin/ash", "-eo", "pipefail", "-c"]
ARG NODE_ENV=production
COPY backend /app
@@ -162,7 +169,7 @@ RUN apk upgrade --no-cache -a && \
find /app/node_modules -name "*.node" -type f -exec file {} \;
FROM alpine:3.23.2
FROM alpine:3.23.3
SHELL ["/bin/ash", "-eo", "pipefail", "-c"]
ENV NODE_ENV=production
ARG LRC_VER=v0.1.32R1
@@ -176,12 +183,14 @@ COPY --from=nginx /src/attachment/attachments/nginx/nginx_attachment_util/libosr
COPY --from=backend /app /app
COPY rootfs /
COPY rootfs /
COPY LICENSE /LICENSE
COPY COPYING /COPYING
WORKDIR /app
RUN apk upgrade --no-cache -a && \
apk add --no-cache tzdata tini \
luajit pcre2 zlib brotli zstd libssl3 libcrypto3 geoip libmaxminddb-libs libldap lua5.1-cjson \
aws-lc pcre2 luajit zlib-ng brotli zstd lua5.1-cjson libxml2 libldap quickjs-ng-libs libmaxminddb-libs \
curl coreutils findutils grep jq openssl shadow su-exec util-linux-misc \
bash bash-completion nano \
logrotate goaccess fcgi \
@@ -227,9 +236,11 @@ RUN apk upgrade --no-cache -a && \
\
chmod +x /usr/local/bin/*
COPY --from=frontend /app/dist /html/frontend
COPY --from=frontend /app/dist /app/frontend
ENTRYPOINT ["tini", "--", "entrypoint.sh"]
HEALTHCHECK CMD healthcheck.sh
LABEL com.centurylinklabs.watchtower.monitor-only="true"
LABEL wud.watch="false"
LABEL wud.watch.digest="false"

README.md

@@ -1,22 +1,22 @@
# NPMplus
This is an improved fork of the nginx-proxy-manager, see below for some changes <br>
If you don't need the web GUI of NPMplus, you may also have a look at caddy: https://caddyserver.com
- [Compatibility (to Upstream)](#compatibility-to-upstream)
- [Quick Setup](#quick-setup)
- [Migration from upstream/vanilla nginx-proxy-manager](#migration-from-upstreamvanilla-nginx-proxy-manager)
**Note: by running NPMplus you agree to the TOS of Let's Encrypt/your custom CA** <br>
**Note: remember to expose udp/quic for the https port (443/udp)** <br>
**Note: this fork is distributed under the GNU Affero General Public License version 3. It is based on the MIT licensed [nginx-proxy-manager](https://github.com/NginxProxyManager/nginx-proxy-manager).** <br>
**Note: by running NPMplus you agree to the TOS of Let's Encrypt/your custom CA.** <br>
**Note: remember to expose udp/quic for the https port (443/udp).** <br>
**Note: remember to add your domain to the [hsts preload list](https://hstspreload.org) if you enabled hsts for your domain.** <br>
**Note: please report issues first to this fork before reporting them to the upstream repository.** <br>
## List of new features
## List of some changes
- Supports HTTP/3 (QUIC), requires you to expose https with udp
- Support for crowdsec and openappsec
- Support for acme profiles (letsencrypt shotlived is used by default)
- Support for acme profiles (letsencrypt shortlived is used by default)
- Improved support for different acme servers (like ocsp/must-staple)
- OIDC support
- smaller image based on alpine
@@ -31,7 +31,8 @@ If you don't need the web GUI of NPMplus, you may also have a look at caddy: htt
- improved nginx build and nginx templates
- file and php server support (and fancyindex)
- option to edit custom certs
- gravatars are cached locally
- gravatars are cached locally and fetched by the backend (better privacy by not exposing you directly to gravatar)
- qrcodes for totp are generated locally in your browser instead of using a third-party api (better privacy/security by not exposing you and the secret to the third-party api)
- re-added some things that were removed with upstream's new frontend
- use secure cookies instead of local storage to save the token
- Password reset (only sqlite) using `docker exec -it npmplus password-reset.js USER_EMAIL PASSWORD`
@@ -42,9 +43,11 @@ If you don't need the web GUI of NPMplus, you may also have a look at caddy: htt
- I test NPMplus with docker, but podman should also work (running the NPMplus container inside an LXC container is discouraged; it will work, but it works better without, so install docker/podman on the host or in a KVM and run NPMplus there)
- MariaDB(/MySQL)/PostgreSQL may work as databases for NPMplus (configuration like in upstream), but they are unsupported, have no advantage over SQLite (at least with NPMplus) and are not recommended. Please note that you can't migrate from any of these to SQLite without a fresh install and/or copying everything yourself.
- NPMplus uses https instead of http for the admin interface
- NPMplus won't trust cloudflare until you set the env SKIP_IP_RANGES to false, but please read [this](#notes-on-cloudflare) first before setting the env to true.
- route53 is not supported as dns-challenge provider and Amazon CloudFront IPs can't be automatically trusted in NPMplus, even if you set SKIP_IP_RANGES env to false.
- The following certbot dns plugins have been replaced, which means that certs using one of these providers will not renew and need to be recreated (not renewed): `certbot-dns-he`, `certbot-dns-dnspod`, `certbot-dns-online` and `certbot-dns-do` (`certbot-dns-do` was replaced in upstream with v2.12.4 and then merged into NPMplus)
- NPMplus won't trust cloudflare until you set the env TRUST_CLOUDFLARE to true, but please read [this](#notes-on-cloudflare) first before setting the env to true.
- route53 is not supported as dns-challenge provider and Amazon CloudFront IPs can't be automatically trusted in NPMplus, even if you set TRUST_CLOUDFLARE env to true.
- The following certbot dns plugins have been replaced, which means that certs using one of these providers will not renew and need to be recreated (not renewed): `certbot-dns-he`, `certbot-dns-dnspod`, `certbot-dns-online`, `certbot-dns-powerdns` and `certbot-dns-do` (`certbot-dns-do` was replaced in upstream with v2.12.4 and then merged into NPMplus)
- There are many changes and improvements to the nginx config, so please don't follow guides on the internet about custom/advanced config; they are either redundant or should not be used at all with NPMplus
- Many forms have changed behavior, see [Comments on some buttons](#comments-on-some-buttons)
## Quick Setup
1. Install Docker and Docker Compose (podman or docker rootless may also work)
@@ -68,10 +71,11 @@ docker compose up -d
6. stop nginx-proxy-manager
7. deploy the NPMplus compose.yaml
8. You should now remove the `/etc/letsencrypt` mount, since it was moved to `/data` while migration, then redeploy the compose file
9. Since many buttons have changed, please check if they are still correct for every host you have.
9. Since many forms have changed, please check if they are still correct for every host you have.
10. If you proxy NPM(plus) through NPM(plus) make sure to change the scheme from http to https
11. Maybe setup crowdsec (see below)
12. Please report all (migration) issues you may have
11. Because of added CSP rules, gravatar images will not load; to fix this, open the form to edit a user's name and save it without changes
12. Maybe setup crowdsec (see below)
13. Please report all (migration) issues you may have
# Crowdsec
<!--Note: Using Immich behind NPMplus with enabled appsec causes issues, see here: [#1241](https://github.com/ZoeyVid/NPMplus/discussions/1241) <br>-->
@@ -91,8 +95,12 @@ name: appsec
source: appsec
labels:
type: appsec
# if you use openappsec you can enable this
#---
# If you use open-appsec, uncomment the section below.
# If connecting to open-appsec cloud, you must edit the default 'log trigger'
# in the cloud dashboard: check "Log to > gateway / agent" and click 'enforce'.
# Otherwise, no intrusion events will be logged to the local agent
# for CrowdSec to process.
#source: file
#filenames:
# - /opt/openappsec/logs/cp-nano-http-transaction-handler.log*
@@ -114,7 +122,7 @@ labels:
2. Make other settings (like TLS)
3. Create a custom location `/` set the scheme to `path`, put in the path, the press the gear button and fill this in (edit the last line):
```
location ~* \.php(?:$|/) {
location ~* [^/]\.php(?:$|/) {
fastcgi_split_path_info ^(.*\.php)(/.*)$;
try_files $fastcgi_script_name =404;
fastcgi_pass ...; # set this to the address of your php-fpm (socket/tcp): https://nginx.org/en/docs/http/ngx_http_fastcgi_module.html#fastcgi_pass
@@ -126,145 +134,55 @@ location ~* \.php(?:$|/) {
2. Set the forwarding port to the php version you want to use and is supported by NPMplus (like 83/84/85)
## Comments on some buttons
- Forward Hostname / IP / Path: if the scheme is set to path you can just put here a path in and nginx works as a file server, otherwise you need to input ip/domain, you can also append a path to the ip/domain like `127.0.0.1/path` to proxy to a subpath. For custom locations a path which ends with `/` will strip the path of the location. So a request `GET /cdf/abc` to a custom location `/cdf` which proxies to `127.0.0.1/abc` will proxy to `127.0.0.1/abc/abc` and a custom location `/cdf` which proxies to `127.0.0.1/abc/` will proxy to `127.0.0.1/abc` (same stripping applies to `path`)
- Forward Hostname / IP / Path: if the scheme is set to path you can just put here a path in and nginx works as a file server, otherwise you need to input ip/domain, you can also append a path to the ip/domain like `127.0.0.1/path` to proxy to a subpath.
- For custom locations with a set path, DNS will only be refreshed on nginx reloads and the path of the location will be stripped. So a request `GET /cdf/abc` to a custom location `/cdf` which proxies to `127.0.0.1/abc` will proxy to `127.0.0.1/abc/abc`, a custom location `/cdf/` which proxies to `127.0.0.1/` will proxy to `127.0.0.1/abc` and a custom location `/cdf` which proxies to `127.0.0.1` will proxy to `127.0.0.1/cdf/abc`
- If the scheme is set to `path`, a path ending with a `/` will be searched relative to the custom location (it uses nginx alias) and a path ending without a `/` will be searched relative to the main `/` location (it uses nginx root)
- Forward Port (optional): port of upstream or php version if scheme is `path`
- Enable fancyindex/compression by upstream: for scheme set to `path` this will enabled fancyindex, which shows a index of all files in the folder if there is no index file, for proxy hosts this will allow the backend to compress files, I recommend you to keep this disabled
- Send noindex header and block some user agents: This does what it says, it appends a header to all responses saying the site should not be indexed, while blocking requests from crawlers based on the user agent sent with the request
- Disable Crowdsec Appsec: this will disable crowdsec appsec only for one host/one location, this will only do something if appsec is configured
- Disable Response Buffering: Most of the time you want to keep buffering enabled; you may want to disable it if, for example, you want to stream videos and have a fast and stable connection to the upstream server. This affects the connection from the upstream server to NPMplus
- Disable Request Buffering: Most of the time you want to keep buffering enabled; request buffering will always be enabled if crowdsec appsec is enabled. You may want to disable it if, for example, you want to upload huge files and have a fast and stable connection to the upstream server. This affects the connection from NPMplus to the upstream server
- Enable compression by upstream: this will allow the backend to compress files; I recommend keeping this disabled, but there may be cases where it is needed because the upstream otherwise misbehaves for some reason (like collabora in nextcloud all-in-one)
- Enable fancyindex: this will enable fancyindex, which shows an index of all files in the folder if there is no index file; only enable this if you know what you are doing and you need the index
- Websockets: this button was removed, websockets are now always enabled
- Reuse Key: this will make the new cert always keep its key unless you force renew it; I recommend keeping this disabled (i.e. not keeping the key), a reason to keep the key would be TLSA/pubkey pinning
- TLS to upstream (for Streams): This can be used if your stream target already uses tls but you want to override it with a NPMplus cert; do not enable this if you don't set a new cert, since that would downgrade the connection to be unencrypted
- X-Frame-Options: will control the X-Frame-Options header, none will remove the header, SAMEORIGIN/DENY will set it to these values and upstream will keep what upstream sends
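The path-stripping rules above match how plain nginx handles a URI part in `proxy_pass`. A minimal sketch in plain nginx syntax (not the exact config NPMplus generates; location names are illustrative):

```
# With a URI part, nginx replaces the matched location prefix:
# GET /cdf/abc is forwarded upstream as /abc/abc.
location /cdf {
    proxy_pass http://127.0.0.1/abc;
}
# Without a URI part, the request path is passed through unchanged:
# GET /other/abc is forwarded upstream as /other/abc.
location /other {
    proxy_pass http://127.0.0.1;
}
```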
## Examples of implementing some services using auth_request
These examples need to be defined for each host (whitelist); if you want to configure them globally with exemptions (blacklist), please create a discussion and I can try to help you with that.
### Anubis config (supported)
1. deploy an anubis container (see the compose.yaml for an example and information)
### Anubis
1. Deploy an anubis container (see the compose.yaml for an example and information)
2. In the mounted anubis bot policy file the "status_codes" should be set to 401 and 403, like this:
```yaml
status_codes:
CHALLENGE: 401
DENY: 403
```
3. Put this in the advanced tab or create a custom location / (or the location you want to use), set your proxy settings, then press the gear button and paste the following in the new text field:
```
auth_request /.within.website/x/cmd/anubis/api/check;
error_page 401 403 =200 /.within.website/?redir=$request_uri;
```
4. Create a location with the path `/.within.website`, this should proxy to your anubis, example: `http://127.0.0.1:8923`, then press the gear button and paste the following in the new text field
```
proxy_redirect ~^[^/]+/.*$ /;
proxy_method GET;
proxy_pass_request_body off;
proxy_set_header Content-Length "";
```
5. You can override the images used by default by creating a custom location `/.within.website/x/cmd/anubis/static/img` which acts as a file server and serves the files `happy.webp`, `pensive.webp` and `reject.webp`
3. Set the AUTH_REQUEST_ANUBIS_UPSTREAM env in the NPMplus compose.yaml and select anubis in the Auth Request selection, no custom/advanced config/locations needed
4. You can override the "allow", "checking" and "blocked" images used by default by setting the `AUTH_REQUEST_ANUBIS_USE_CUSTOM_IMAGES` env to true and putting your custom images as happy.webp, pensive.webp and reject.webp into /opt/npmplus/anubis
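The compose side of the Anubis setup above might look like the following sketch; the env names come from this README, while the service name, address and port are placeholders you must adapt to your deployment:

```yaml
services:
  npmplus:
    # ... rest of your NPMplus service definition ...
    environment:
      # address of your anubis container (placeholder value)
      - "AUTH_REQUEST_ANUBIS_UPSTREAM=http://anubis:8923"
      # only needed if you serve your own happy/pensive/reject.webp
      - "AUTH_REQUEST_ANUBIS_USE_CUSTOM_IMAGES=true"
```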
### Tinyauth config example (some support)
1. Put this in the advanced tab or create a custom location / (or the location you want to use), set your proxy settings, then press the gear button and paste the following in the new text field
```
auth_request /tinyauth;
error_page 401 = @tinyauth_login;
```
2. Create a custom location with the path `/tinyauth`, this should proxy to your tinyauth, example: `http://<ip>:<port>/api/auth/nginx`, then press the gear button and paste the following in the new text field
```
internal;
proxy_method GET;
proxy_pass_request_body off;
proxy_set_header Content-Length "";
```
3. Create a custom location `@tinyauth_login`, set the scheme to `empty`, then press the gear button and paste the following in the new text field, you need to replace `tinyauth.example.org` with the domain of your tinyauth.
```
internal;
return 302 http://tinyauth.example.org/login?redirect_uri=$scheme://$host$is_request_port$request_port$request_uri;
```
### Tinyauth
1. Set the AUTH_REQUEST_TINYAUTH_UPSTREAM and AUTH_REQUEST_TINYAUTH_DOMAIN env in the NPMplus compose.yaml and select tinyauth in the Auth Request selection, no custom/advanced config/locations needed
### Authelia config example (limited support)
1. Create a custom location / (or the location you want to use), set your proxy settings, then press the gear button and paste the following in the new text field or paste it in the advanced tab (but then the headers won't work):
```
auth_request /internal/authelia/authz;
auth_request_set $redirection_url $upstream_http_location;
error_page 401 =302 $redirection_url;
### Authelia (modern)
1. Set the AUTH_REQUEST_AUTHELIA_UPSTREAM env in the NPMplus compose.yaml and select authelia (modern) in the Auth Request selection, no custom/advanced config/locations needed
auth_request_set $user $upstream_http_remote_user;
auth_request_set $groups $upstream_http_remote_groups;
auth_request_set $name $upstream_http_remote_name;
auth_request_set $email $upstream_http_remote_email;
proxy_set_header Remote-User $user;
proxy_set_header Remote-Groups $groups;
proxy_set_header Remote-Email $email;
proxy_set_header Remote-Name $name;
```
2. Create a location with the path `/internal/authelia/authz`, this should proxy to your authelia, example `http://127.0.0.1:9091/api/authz/auth-request`, then press the gear button and paste the following in the new text field
```
internal;
proxy_method GET;
proxy_pass_request_body off;
proxy_set_header Content-Length "";
```
### Authentik config example (very limited support)
1. create a custom location / (or the location you want to use), set your proxy settings, then press the gear button and paste the following in the new text field or paste it in the advanced tab (but then the headers won't work), you may need to adjust the last lines:
```
auth_request /outpost.goauthentik.io/auth/nginx;
error_page 401 = @goauthentik_proxy_signin;
auth_request_set $auth_cookie $upstream_http_set_cookie;
add_header Set-Cookie $auth_cookie;
auth_request_set $authentik_username $upstream_http_x_authentik_username;
auth_request_set $authentik_groups $upstream_http_x_authentik_groups;
auth_request_set $authentik_entitlements $upstream_http_x_authentik_entitlements;
auth_request_set $authentik_email $upstream_http_x_authentik_email;
auth_request_set $authentik_name $upstream_http_x_authentik_name;
auth_request_set $authentik_uid $upstream_http_x_authentik_uid;
proxy_set_header X-authentik-username $authentik_username;
proxy_set_header X-authentik-groups $authentik_groups;
proxy_set_header X-authentik-entitlements $authentik_entitlements;
proxy_set_header X-authentik-email $authentik_email;
proxy_set_header X-authentik-name $authentik_name;
proxy_set_header X-authentik-uid $authentik_uid;
# This section should be uncommented when the "Send HTTP Basic authentication" option is enabled in the proxy provider
#auth_request_set $authentik_auth $upstream_http_authorization;
#proxy_set_header Authorization $authentik_auth;
```
2. Create a location with the path `/outpost.goauthentik.io`; this should proxy to your authentik, for example `https://127.0.0.1:9443/outpost.goauthentik.io` for the embedded outpost (or `https://127.0.0.1:9443` for manual outpost deployments). Then press the gear button and paste the following into the new text field:
```
auth_request_set $auth_cookie $upstream_http_set_cookie;
add_header Set-Cookie $auth_cookie;
proxy_method GET;
proxy_pass_request_body off;
proxy_set_header Content-Length "";
```
3. Create a custom location `@goauthentik_proxy_signin`, set the scheme to `empty`, then press the gear button and paste the following into the new text field; you may need to adjust the last lines:
```
internal;
add_header Set-Cookie $auth_cookie;
return 302 /outpost.goauthentik.io/start?rd=$request_uri;
## For domain level, use the below return to redirect to your authentik server with the full redirect path
#return 302 https://authentik.company/outpost.goauthentik.io/start?rd=$scheme://$host$is_request_port$request_port$request_uri;
```
### Authentik
1. Set the AUTH_REQUEST_AUTHENTIK_UPSTREAM env (and optionally the AUTH_REQUEST_AUTHENTIK_DOMAIN env if you use the "domain level" variant in authentik; do not set this env if you use the "single application" variant) in the NPMplus compose.yaml and select authentik/authentik-send-basic-auth in the Auth Request selection; no custom/advanced config/locations needed
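As a minimal sketch of the env-based setup above (values are placeholders; check the comments in the NPMplus compose.yaml for the exact expected format):
```yaml
services:
  npmplus:
    environment:
      # placeholder - point this at your authentik outpost
      - "AUTH_REQUEST_AUTHENTIK_UPSTREAM=https://127.0.0.1:9443"
      # only for the "domain level" variant, otherwise leave unset:
      #- "AUTH_REQUEST_AUTHENTIK_DOMAIN=authentik.example.com"
```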
## Load Balancing
1. Open and edit this file: `/opt/npmplus/custom_nginx/http_top.conf` (or `/opt/npmplus/custom_nginx/stream_top.conf` for streams); if you changed /opt/npmplus to a different path, adjust the path accordingly
2. Set the upstream directive(s) with the servers that should be load balanced (https://nginx.org/en/docs/http/ngx_http_upstream_module.html / https://nginx.org/en/docs/stream/ngx_stream_upstream_module.html); they all need to use the same protocol (either http(s) or grpc(s) for proxy hosts, or tcp/udp/proxy protocol for streams), for example:
```
# a) at least one backend uses a different port; optionally, one external server is marked as backup
upstream service1 {
server 127.0.0.1:44;
server 127.0.0.1:33;
server 127.0.0.1:22;
server 192.158.168.11:44 backup;
}
# b) all services use the same port
upstream service2 {
server 192.158.168.14;
server 192.158.168.13;
server 192.158.168.12;
server 192.158.168.11;
}
```
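nginx also supports balancing methods and per-server weights in the upstream block; a hedged sketch (the addresses and the name service3 are placeholders, see the nginx upstream docs linked above):
```
# c) optional: least_conn balancing with weights
upstream service3 {
least_conn;
server 192.158.168.21:8080 weight=2;
server 192.158.168.22:8080;
}
```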
3. Configure your proxy host/stream as usual in the UI, but set the hostname to service1 (or service2, or however you named it). If you followed example a), keep the forward port field empty (since you set the ports within the upstream directive); for example b), you need to set it.
## Geoblocking example (mainly community support)
@@ -294,7 +212,7 @@ geoip2 /data/goaccess/geoip/GeoLite2-Country.mmdb {
#}
# uncomment if you block/don't allow IPs with unknown country codes
#geo $is_private_ip {
# default no;
# 127.0.0.0/8 yes;
# 10.0.0.0/8 yes;
@@ -306,16 +224,55 @@ geoip2 /data/goaccess/geoip/GeoLite2-Country.mmdb {
# fec0::/10 yes;
#}
```
4a. to set it per location: create a custom location / (or the location you want to use), set your proxy settings, then press the gear button and paste the following into the new text field; you may want to adjust the last lines (do not use the advanced tab with this example, as it may break cert renewals):
```
# uncomment if you block/don't allow IPs with unknown country codes
#if ($is_private_ip = yes) {
# set $geoip2_country_rule yes;
#}
if ($geoip2_country_rule = no) {
return 444; # this rejects the connection, but you can also return 403 to tell the client that it was denied
}
```
4b. to set it for an entire host: put this in the advanced tab:
```
# uncomment if you block/don't allow IPs with unknown country codes
#if ($is_private_ip = yes) {
# set $geoip2_country_rule yes;
#}
if ($request_uri ~* "^/\.well-known/acme-challenge/") {
set $geoip2_country_rule yes;
}
if ($geoip2_country_rule = no) {
return 444; # this rejects the connection, but you can also return 403 to tell the client that it was denied
}
```
4c. to set it for all http hosts of the same type: put this in the `custom_nginx/server_proxy.conf` / `custom_nginx/server_redirect.conf` / `custom_nginx/server_dead.conf` file(s):
```
# uncomment if you block/don't allow IPs with unknown country codes
#if ($is_private_ip = yes) {
# set $geoip2_country_rule yes;
#}
if ($request_uri ~* "^/\.well-known/acme-challenge/") {
set $geoip2_country_rule yes;
}
if ($geoip2_country_rule = no) {
return 444; # this rejects the connection, but you can also return 403 to tell the client that it was denied
}
```
4d. to set it for all http hosts: put this in the `custom_nginx/server_http.conf` file:
```
# uncomment if you block/don't allow IPs with unknown country codes
#if ($is_private_ip = yes) {
# set $geoip2_country_rule yes;
#}
if ($request_uri ~* "^/\.well-known/acme-challenge/") {
set $geoip2_country_rule yes;
}
if ($geoip2_country_rule = no) {
return 444; # this rejects the connection, but you can also return 403 to tell the client that it was denied
}
```
5. you can create multiple rule lists by adding multiple map directives, but each rule list needs its own unique variable name instead of `$geoip2_country_rule` (use that unique name in the custom locations as well)
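The rule-list note above can be sketched like this; it assumes `$geoip2_country_code` is set by your geoip2 block as in the example further up, and the variable name and country codes are placeholders:
```
# second, independent rule list with its own unique variable name
map $geoip2_country_code $geoip2_country_rule_two {
default no;
DE yes;
CH yes;
}
```
In the matching custom location, test `$geoip2_country_rule_two` instead of `$geoip2_country_rule`.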
## Prerun scripts (EXPERT option) - if you don't know what this is, ignore it
@@ -365,6 +322,7 @@ If you need to run scripts before NPMplus launches put them under: `/opt/npmplus
- to your acme/ocsp server
- to github for a daily update check
- if not disabled gravatar for profile pictures
- if used to your OIDC provider
- if used to pypi to download certbot plugins
- if used to your dns provider for acme dns challenges
- if used to www.site24x7.com for the reachability check


@@ -8,7 +8,11 @@ import mainRoutes from "./routes/main.js";
* App
*/
const app = express();
app.use(
fileUpload({
limits: { fileSize: 1024 * 1024 },
}),
);
app.use(cookieParser());
app.use(express.json());
app.use(express.urlencoded({ extended: true }));
@@ -21,6 +25,25 @@ app.disable("x-powered-by");
app.enable("trust proxy", ["loopback", "linklocal", "uniquelocal"]);
app.enable("strict routing");
app.use((req, res, next) => {
if (["same-origin", undefined, "none"].includes(req.get("sec-fetch-site"))) {
return next();
}
if (
req.method === "GET" &&
req.path === "/api/oidc/callback" &&
req.get("sec-fetch-mode") === "navigate" &&
req.get("sec-fetch-dest") === "document"
) {
return next();
}
res.status(403).json({
error: { message: "Rejected Sec-Fetch-Site Value." },
});
});
// pretty print JSON when not live
app.set("json spaces", 2);


@@ -1,5 +1,5 @@
{
"$schema": "https://biomejs.dev/schemas/2.3.10/schema.json",
"$schema": "https://biomejs.dev/schemas/2.4.5/schema.json",
"vcs": {
"enabled": true,
"clientKind": "git",


@@ -17,6 +17,12 @@
"credentials": "dns_aliyun_access_key = 12345678\ndns_aliyun_access_key_secret = 1234567890abcdef1234567890abcdef",
"full_plugin_name": "dns-aliyun"
},
"arvan": {
"name": "ArvanCloud",
"package_name": "certbot-dns-arvan",
"credentials": "dns_arvan_key = Apikey xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
"full_plugin_name": "dns-arvan"
},
"azure": {
"name": "Azure",
"package_name": "certbot-dns-azure",
@@ -227,6 +233,12 @@
"credentials": "dns_hurricane_electric_user = Me\ndns_hurricane_electric_pass = my HE password",
"full_plugin_name": "dns-hurricane_electric"
},
"he-ddns": {
"name": "Hurricane Electric - DDNS",
"package_name": "certbot-dns-he-ddns",
"credentials": "dns_he_ddns_password = verysecurepassword",
"full_plugin_name": "dns-he-ddns"
},
"hetzner": {
"name": "Hetzner",
"package_name": "certbot-dns-hetzner",
@@ -287,6 +299,12 @@
"credentials": "dns_joker_username = <Dynamic DNS Authentication Username>\ndns_joker_password = <Dynamic DNS Authentication Password>\ndns_joker_domain = <Dynamic DNS Domain>",
"full_plugin_name": "dns-joker"
},
"kas": {
"name": "All-Inkl",
"package_name": "certbot-dns-kas",
"credentials": "dns_kas_user = your_kas_user\ndns_kas_password = your_kas_password",
"full_plugin_name": "dns-kas"
},
"leaseweb": {
"name": "LeaseWeb",
"package_name": "certbot-dns-leaseweb",
@@ -379,9 +397,9 @@
},
"powerdns": {
"name": "PowerDNS",
"package_name": "certbot-dns-powerdns",
"credentials": "dns_powerdns_api_url = https://api.mypowerdns.example.org\ndns_powerdns_api_key = AbCbASsd!@34",
"full_plugin_name": "dns-powerdns"
"package_name": "certbot-dns-pdns",
"credentials": "dns_pdns_endpoint = https://pdns-api.example.com\ndns_pdns_api_key = <Your API Key>\ndns_pdns_server_id = localhost # see https://doc.powerdns.com/authoritative/http-api/server.html\ndns_pdns_disable_notify = false # Disable notification of secondaries after record changes",
"full_plugin_name": "dns-pdns"
},
"regru": {
"name": "reg.ru",
@@ -413,6 +431,12 @@
"credentials": "dns_selectel_api_v2_account_id = your_account_id\ndns_selectel_api_v2_project_name = your_project\ndns_selectel_api_v2_username = your_username\ndns_selectel_api_v2_password = your_password",
"full_plugin_name": "dns-selectel-api-v2"
},
"simply": {
"name": "Simply",
"package_name": "certbot-dns-simply",
"credentials": "dns_simply_account_name = UExxxxxx\ndns_simply_api_key = DsHJdsjh2812872sahj",
"full_plugin_name": "dns-simply"
},
"spaceship": {
"name": "Spaceship",
"package_name": "certbot-dns-spaceship",


@@ -9,22 +9,18 @@ import { migrateUp } from "./migrate.js";
import { getCompiledSchema } from "./schema/index.js";
import setup from "./setup.js";
async function appStart() {
return migrateUp()
.then(setup)
.then(getCompiledSchema)
.then(() => {
if (process.env.TRUST_CLOUDFLARE === "false") {
logger.info("Cloudflares IPs are NOT trusted");
return;
}
logger.info("Cloudflares IPs are trusted");
internalIpRanges.initTimer();
return internalIpRanges.fetch();
})
.then(() => {
internalCertificate.initTimer();

backend/internal/2fa.js Normal file

@@ -0,0 +1,388 @@
import crypto from "node:crypto";
import bcrypt from "bcryptjs";
import { createGuardrails, generateSecret, generateURI, verify } from "otplib";
import errs from "../lib/error.js";
import authModel from "../models/auth.js";
import internalUser from "./user.js";
const APP_NAME = "NPMplus";
const BACKUP_CODE_COUNT = 8;
/**
* Generate backup codes
* @returns {Promise<{plain: string[], hashed: string[]}>}
*/
const generateBackupCodes = async () => {
const plain = [];
const hashed = [];
for (let i = 0; i < BACKUP_CODE_COUNT; i++) {
const code = crypto.randomBytes(4).toString("hex").toUpperCase();
plain.push(code);
const hash = await bcrypt.hash(code, 10);
hashed.push(hash);
}
return { plain, hashed };
};
const internal2fa = {
/**
* Check if user has 2FA enabled
* @param {number} userId
* @returns {Promise<boolean>}
*/
isEnabled: async (userId) => {
const auth = await internal2fa.getUserPasswordAuth(userId);
return auth?.meta?.totp_enabled === true;
},
/**
* Get 2FA status for user
* @param {Access} access
* @param {number} userId
* @returns {Promise<{enabled: boolean, backup_codes_remaining: number}>}
*/
getStatus: async (access, userId) => {
await access.can("users:password", userId);
await internalUser.get(access, { id: userId });
const auth = await internal2fa.getUserPasswordAuth(userId);
const enabled = auth?.meta?.totp_enabled === true;
let backup_codes_remaining = 0;
if (enabled) {
const backupCodes = auth.meta.backup_codes || [];
backup_codes_remaining = backupCodes.length;
}
return {
enabled,
backup_codes_remaining,
};
},
/**
* Start 2FA setup - store pending secret
*
* @param {Access} access
* @param {number} userId
* @returns {Promise<{secret: string, otpauth_url: string}>}
*/
startSetup: async (access, userId) => {
await access.can("users:password", userId);
const user = await internalUser.get(access, { id: userId });
const secret = generateSecret();
const otpauth_url = generateURI({
issuer: APP_NAME,
label: user.email,
secret: secret,
});
const auth = await internal2fa.getUserPasswordAuth(userId);
// ensure user isn't already setup for 2fa
const enabled = auth?.meta?.totp_enabled === true;
if (enabled) {
throw new errs.ValidationError("2FA is already enabled");
}
const meta = auth.meta || {};
meta.totp_pending_secret = secret;
await authModel
.query()
.where("id", auth.id)
.andWhere("user_id", userId)
.andWhere("type", "password")
.patch({ meta });
return { secret, otpauth_url };
},
/**
* Enable 2FA after verifying code
*
* @param {Access} access
* @param {number} userId
* @param {string} code
* @returns {Promise<{backup_codes: string[]}>}
*/
enable: async (access, userId, code) => {
await access.can("users:password", userId);
await internalUser.get(access, { id: userId });
const auth = await internal2fa.getUserPasswordAuth(userId);
const secret = auth?.meta?.totp_pending_secret || false;
if (!secret) {
throw new errs.ValidationError("No pending 2FA setup found");
}
const codeTrim = code.trim();
const result = await verify({ token: codeTrim, secret });
if (!result.valid) {
throw new errs.ValidationError("Invalid verification code");
}
const { plain, hashed } = await generateBackupCodes();
const meta = {
...auth.meta,
totp_secret: secret,
totp_enabled: true,
totp_enabled_at: new Date().toISOString(),
backup_codes: hashed,
};
delete meta.totp_pending_secret;
await authModel
.query()
.where("id", auth.id)
.andWhere("user_id", userId)
.andWhere("type", "password")
.patch({ meta });
return { backup_codes: plain };
},
/**
* Disable 2FA
*
* @param {Access} access
* @param {number} userId
* @param {string} code
* @returns {Promise<void>}
*/
disable: async (access, userId, code) => {
await access.can("users:password", userId);
await internalUser.get(access, { id: userId });
const auth = await internal2fa.getUserPasswordAuth(userId);
const enabled = auth?.meta?.totp_enabled === true;
if (!enabled) {
throw new errs.ValidationError("2FA is not enabled");
}
const codeTrim = code.trim();
if (codeTrim.length !== 6 && codeTrim.length !== 8) {
throw new errs.ValidationError("Invalid verification code");
}
// Try a TOTP code first if it's 6 chars: verify() throws for tokens that aren't
// 6 chars, and backup codes are 8 chars.
if (codeTrim.length === 6) {
const result = await verify({
token: codeTrim,
secret: auth.meta.totp_secret,
// These guardrails lower the minimum length requirement for secrets.
// In v12 of otplib the default minimum length is 10 and in v13 it is 16.
// Since there are 2fa secrets in the wild generated with v12 we need to allow shorter secrets
// so people won't be locked out when upgrading.
guardrails: createGuardrails({
MIN_SECRET_BYTES: 10,
}),
});
if (!result.valid) {
throw new errs.ValidationError("Invalid verification code");
}
}
// Try backup codes
if (codeTrim.length === 8) {
const backupCodes = auth?.meta?.backup_codes || [];
let invalid = true;
for (let i = 0; i < backupCodes.length; i++) {
const match = await bcrypt.compare(codeTrim.toUpperCase(), backupCodes[i]);
if (match) {
// Remove used backup code
const updatedCodes = [...backupCodes];
updatedCodes.splice(i, 1);
const meta = { ...auth.meta, backup_codes: updatedCodes };
await authModel
.query()
.where("id", auth.id)
.andWhere("user_id", userId)
.andWhere("type", "password")
.patch({ meta });
invalid = false;
}
}
if (invalid) {
throw new errs.ValidationError("Invalid verification code");
}
}
const meta = { ...auth.meta };
delete meta.totp_secret;
delete meta.totp_enabled;
delete meta.totp_enabled_at;
delete meta.backup_codes;
await authModel
.query()
.where("id", auth.id)
.andWhere("user_id", userId)
.andWhere("type", "password")
.patch({ meta });
},
/**
* Verify 2FA code for login
*
* @param {number} userId
* @param {string} token
* @returns {Promise<boolean>}
*/
verifyForLogin: async (userId, token) => {
const auth = await internal2fa.getUserPasswordAuth(userId);
const secret = auth?.meta?.totp_secret || false;
if (!secret) {
return false;
}
const tokenTrim = token.trim();
// Try a TOTP code first if it's 6 chars: verify() throws for tokens that aren't
// 6 chars, and backup codes are 8 chars.
if (tokenTrim.length === 6) {
const result = await verify({
token: tokenTrim,
secret,
// These guardrails lower the minimum length requirement for secrets.
// In v12 of otplib the default minimum length is 10 and in v13 it is 16.
// Since there are 2fa secrets in the wild generated with v12 we need to allow shorter secrets
// so people won't be locked out when upgrading.
guardrails: createGuardrails({
MIN_SECRET_BYTES: 10,
}),
});
return result.valid;
}
// Try backup codes
if (tokenTrim.length === 8) {
const backupCodes = auth?.meta?.backup_codes || [];
for (let i = 0; i < backupCodes.length; i++) {
const match = await bcrypt.compare(tokenTrim.toUpperCase(), backupCodes[i]);
if (match) {
// Remove used backup code
const updatedCodes = [...backupCodes];
updatedCodes.splice(i, 1);
const meta = { ...auth.meta, backup_codes: updatedCodes };
await authModel
.query()
.where("id", auth.id)
.andWhere("user_id", userId)
.andWhere("type", "password")
.patch({ meta });
return true;
}
}
}
return false;
},
/**
* Regenerate backup codes
*
* @param {Access} access
* @param {number} userId
* @param {string} token
* @returns {Promise<{backup_codes: string[]}>}
*/
regenerateBackupCodes: async (access, userId, token) => {
await access.can("users:password", userId);
await internalUser.get(access, { id: userId });
const auth = await internal2fa.getUserPasswordAuth(userId);
const enabled = auth?.meta?.totp_enabled === true;
const secret = auth?.meta?.totp_secret || false;
if (!enabled) {
throw new errs.ValidationError("2FA is not enabled");
}
if (!secret) {
throw new errs.ValidationError("No 2FA secret found");
}
const tokenTrim = token.trim();
if (tokenTrim.length !== 6 && tokenTrim.length !== 8) {
throw new errs.ValidationError("Invalid verification code");
}
// Try a TOTP code first if it's 6 chars: verify() throws for tokens that aren't
// 6 chars, and backup codes are 8 chars.
if (tokenTrim.length === 6) {
const result = await verify({
token: tokenTrim,
secret,
// These guardrails lower the minimum length requirement for secrets.
// In v12 of otplib the default minimum length is 10 and in v13 it is 16.
// Since there are 2fa secrets in the wild generated with v12 we need to allow shorter secrets
// so people won't be locked out when upgrading.
guardrails: createGuardrails({
MIN_SECRET_BYTES: 10,
}),
});
if (!result.valid) {
throw new errs.ValidationError("Invalid verification code");
}
}
// Try backup codes
if (tokenTrim.length === 8) {
const backupCodes = auth?.meta?.backup_codes || [];
let invalid = true;
for (let i = 0; i < backupCodes.length; i++) {
const match = await bcrypt.compare(tokenTrim.toUpperCase(), backupCodes[i]);
if (match) {
// Remove used backup code
const updatedCodes = [...backupCodes];
updatedCodes.splice(i, 1);
const meta = { ...auth.meta, backup_codes: updatedCodes };
await authModel
.query()
.where("id", auth.id)
.andWhere("user_id", userId)
.andWhere("type", "password")
.patch({ meta });
invalid = false;
}
}
if (invalid) {
throw new errs.ValidationError("Invalid verification code");
}
}
const { plain, hashed } = await generateBackupCodes();
const meta = { ...auth.meta, backup_codes: hashed };
await authModel
.query()
.where("id", auth.id)
.andWhere("user_id", userId)
.andWhere("type", "password")
.patch({ meta });
return { backup_codes: plain };
},
getUserPasswordAuth: async (userId) => {
const auth = await authModel.query().where("user_id", userId).andWhere("type", "password").first();
if (!auth) {
throw new errs.ItemNotFoundError("Auth not found");
}
return auth;
},
};
export default internal2fa;


@@ -42,7 +42,7 @@ const internalAccessList = {
accessListAuthModel.query().insert({
access_list_id: row.id,
username: item.username,
password: bcrypt.hashSync(item.password, 6),
}),
);
return true;
@@ -129,7 +129,7 @@ const internalAccessList = {
accessListAuthModel.query().insert({
access_list_id: data.id,
username: item.username,
password: bcrypt.hashSync(item.password, 6),
}),
);
} else {
@@ -432,7 +432,7 @@ const internalAccessList = {
logger.info(`Adding: ${item.username}`);
try {
fs.appendFileSync(htpasswdFile, `${item.username}:${item.password}\n`, {
encoding: "utf8",
});
} catch (err) {


@@ -1,10 +1,11 @@
import { createPrivateKey, X509Certificate } from "node:crypto";
import { mkdir, readFile, rm, writeFile } from "node:fs/promises";
import fs from "node:fs";
import path from "node:path";
import { domainToASCII } from "node:url";
import archiver from "archiver";
import dayjs from "dayjs";
import _ from "lodash";
import moment from "moment";
import tempWrite from "temp-write";
import dnsPlugins from "../certbot/dns-plugins.json" with { type: "json" };
import { installPlugin } from "../lib/certbot.js";
import error from "../lib/error.js";
@@ -20,7 +21,7 @@ const omissions = () => {
};
const internalCertificate = {
allowedSslFiles: ["certificate", "certificate_key", "intermediate_certificate"],
allowedSslFiles: ["certificate", "certificate_key"],
intervalTimeout: 1000 * 60 * 60 * Number.parseInt(process.env.CRT, 10),
interval: null,
intervalProcessing: false,
@@ -38,74 +39,68 @@ const internalCertificate = {
/**
* Triggered by a timer, this will check for expiring hosts and renew their tls certs if required
*/
processExpiringHosts: async () => {
if (internalCertificate.intervalProcessing) {
return;
}
internalCertificate.intervalProcessing = true;
logger.info("Renewing Certbot TLS certs close to expiry...");
try {
try {
const result = await utils.execFile("certbot", [
"--config",
"/etc/certbot.ini",
"renew",
"--server",
process.env.ACME_SERVER,
"--quiet",
]);
if (result) logger.info(`Renew Result: ${result}`);
} catch (err) {
logger.warn(`Certbot completed with errors: ${err}`);
}
try {
await internalNginx.reload();
} catch (err) {
logger.error(err);
}
const certificates = await certificateModel
.query()
.where("is_deleted", 0)
.andWhere("provider", "letsencrypt");
if (certificates && certificates.length > 0) {
const updatePromises = certificates.map(async (certificate) => {
try {
const certInfo = await internalCertificate.getCertificateInfoFromFile(
`${internalCertificate.getLiveCertPath(certificate.id)}/fullchain.pem`,
);
await certificateModel
.query()
.where("id", certificate.id)
.andWhere("provider", "letsencrypt")
.patch({
expires_on: dayjs.unix(certInfo.dates.to).format("YYYY-MM-DD HH:mm:ss"),
});
} catch (err) {
// Don't want to stop the train here, just log the error
logger.error(err);
}
});
await Promise.all(updatePromises);
logger.info("Renew Complete");
}
} catch (err) {
logger.error(err);
} finally {
internalCertificate.intervalProcessing = false;
}
},
@@ -143,7 +138,7 @@ const internalCertificate = {
const savedRow = await certificateModel
.query()
.patchAndFetchById(certificate.id, {
expires_on: dayjs.unix(certInfo.dates.to).format("YYYY-MM-DD HH:mm:ss"),
})
.then(utils.omitRow(omissions()));
@@ -367,8 +362,8 @@ const internalCertificate = {
// Revoke the cert
await internalCertificate.revokeCertbot(row);
} else {
await rm(`/data/tls/custom/npm-${row.id}`, { force: true, recursive: true });
await rm(`/data/tls/custom/npm-${row.id}.der`, { force: true });
}
return true;
},
@@ -436,48 +431,17 @@ const internalCertificate = {
* @returns {Promise}
*/
writeCustomCert: async (certificate) => {
if (certificate.provider === "letsencrypt") {
throw new Error("Refusing to write certbot certs here");
}
logger.info("Writing Custom Certificate:", certificate.id);
const dir = `/data/tls/custom/npm-${certificate.id}`;
await mkdir(dir, { recursive: true });
await writeFile(`${dir}/fullchain.pem`, certificate.meta.certificate);
await writeFile(`${dir}/privkey.pem`, certificate.meta.certificate_key);
},
/**
@@ -502,40 +466,22 @@ const internalCertificate = {
* @param {Object} data.files
* @returns {Promise}
*/
validate: async (data) => {
const finalData = {};
for (const [name, file] of Object.entries(data.files)) {
if (internalCertificate.allowedSslFiles.includes(name)) {
const content = file.data.toString();
let res;
if (name === "certificate_key") {
res = await internalCertificate.checkPrivateKey(content);
} else {
res = await internalCertificate.getCertificateInfo(content, true);
}
finalData[name] = res;
}
}
return finalData;
},
/**
@@ -552,26 +498,27 @@ const internalCertificate = {
}
const validations = await internalCertificate.validate(data);
if (typeof validations.certificate === "undefined") {
throw new error.ValidationError("Certificate file was not provided");
if (typeof validations.certificate === "undefined" || typeof validations.certificate_key === "undefined") {
throw new error.ValidationError("Certificate and Certificate Key files were not provided");
}
const certs = {};
_.map(data.files, (file, name) => {
if (internalCertificate.allowedSslFiles.indexOf(name) !== -1) {
row.meta[name] = file.data.toString();
certs[name] = file.data.toString();
}
});
const certificate = await internalCertificate.update(access, {
id: data.id,
expires_on: moment(validations.certificate.dates.to, "X").format("YYYY-MM-DD HH:mm:ss"),
domain_names: [validations.certificate.cn],
expires_on: dayjs.unix(validations.certificate.dates.to).format("YYYY-MM-DD HH:mm:ss"),
domain_names: validations.certificate.cn,
meta: _.clone(row.meta), // Prevent the update method from changing this value that we'll use later
});
certificate.meta = row.meta;
certificate.meta = _.assign({}, row.meta, certs);
await internalCertificate.writeCustomCert(certificate);
return _.pick(row.meta, internalCertificate.allowedSslFiles);
return _.omit(certificate.meta, internalCertificate.allowedSslFiles);
},
/**
@@ -581,24 +528,10 @@ const internalCertificate = {
* @param {String} privateKey This is the entire key contents as a string
*/
checkPrivateKey: async (privateKey) => {
try {
createPrivateKey(privateKey);
return true;
} catch (err) {
throw new error.ValidationError(`Certificate Key is not valid (${err.message})`, err);
}
},
@@ -611,77 +544,38 @@ const internalCertificate = {
* @param {Boolean} [throwExpired] Throw when the certificate is out of date
*/
getCertificateInfo: async (certificate, throwExpired) => {
try {
const filepath = await tempWrite(certificate, "/tmp");
const certData = await internalCertificate.getCertificateInfoFromFile(filepath, throwExpired);
fs.unlinkSync(filepath);
return certData;
} catch (err) {
fs.unlinkSync(filepath);
throw err;
}
},
/**
* Uses the openssl command to both validate and get info out of the certificate.
* It will save the file to disk first, then run commands on it, then delete the file.
*
* @param {String} certificateFile The file location on disk
* @param {Boolean} [throw_expired] Throw when the certificate is out of date
*/
getCertificateInfoFromFile: async (certificateFile, throw_expired) => {
const certData = {};
try {
const result = await utils.execFile("openssl", ["x509", "-in", certificateFile, "-subject", "-noout"]);
// Examples:
// subject=CN = *.jc21.com
// subject=CN = something.example.com
const regex = /(?:subject=)?[^=]+=\s*(\S+)/gim;
const match = regex.exec(result);
if (match && typeof match[1] !== "undefined") {
certData.cn = match[1];
}
const cert = new X509Certificate(certificate);
const result2 = await utils.execFile("openssl", ["x509", "-in", certificateFile, "-issuer", "-noout"]);
// Examples:
// issuer=C = US, O = Let's Encrypt, CN = Let's Encrypt Authority X3
// issuer=C = US, O = Let's Encrypt, CN = E5
// issuer=O = NginxProxyManager, CN = NginxProxyManager Intermediate CA","O = NginxProxyManager, CN = NginxProxyManager Intermediate CA
const regex2 = /^(?:issuer=)?(.*)$/gim;
const match2 = regex2.exec(result2);
if (match2 && typeof match2[1] !== "undefined") {
certData.issuer = match2[1];
}
const result3 = await utils.execFile("openssl", ["x509", "-in", certificateFile, "-dates", "-noout"]);
// notBefore=Jul 14 04:04:29 2018 GMT
// notAfter=Oct 12 04:04:29 2018 GMT
let validFrom = null;
let validTo = null;
const lines = result3.split("\n");
lines.map((str) => {
const regex = /^(\S+)=(.*)$/gim;
const match = regex.exec(str.trim());
if (match && typeof match[2] !== "undefined") {
const date = Number.parseInt(moment(match[2], "MMM DD HH:mm:ss YYYY z").format("X"), 10);
if (match[1].toLowerCase() === "notbefore") {
validFrom = date;
} else if (match[1].toLowerCase() === "notafter") {
validTo = date;
}
if (cert.subjectAltName) {
certData.cn = cert.subjectAltName.split(", ").map((entry) => {
const firstColonIdx = entry.indexOf(":");
return firstColonIdx === -1 ? entry.trim() : entry.substring(firstColonIdx + 1).trim();
});
} else {
const cnMatch = /\bCN=([^\n]+)/i.exec(cert.subject);
if (cnMatch?.[1]) {
certData.cn = [cnMatch[1].trim()];
} else {
certData.cn = [];
}
return true;
});
if (!validFrom || !validTo) {
throw new error.ValidationError(`Could not determine dates from certificate: ${result}`);
}
if (throw_expired && validTo < Number.parseInt(moment().format("X"), 10)) {
if (cert.issuer) {
certData.issuer = cert.issuer.replace(/\n/g, ", ");
}
const validFrom = Math.floor(new Date(cert.validFrom).getTime() / 1000);
const validTo = Math.floor(new Date(cert.validTo).getTime() / 1000);
if (Number.isNaN(validFrom) || Number.isNaN(validTo)) {
throw new error.ValidationError("Could not determine dates from certificate");
}
const now = Math.floor(Date.now() / 1000);
if (throwExpired && validTo < now) {
throw new error.ValidationError("Certificate has expired");
}
@@ -696,6 +590,18 @@ const internalCertificate = {
}
},
/**
* Uses the openssl command to both validate and get info out of the certificate.
* It will save the file to disk first, then run commands on it, then delete the file.
*
* @param {String} certificateFile The file location on disk
* @param {Boolean} [throwExpired] Throw when the certificate is out of date
*/
getCertificateInfoFromFile: async (certificateFile, throwExpired) => {
const certContent = await readFile(certificateFile);
return internalCertificate.getCertificateInfo(certContent, throwExpired);
},
/**
* Cleans the tls keys from the meta object and sets them to "true"
* @param {String} email the email address to use for registration
@@ -758,11 +664,11 @@ const internalCertificate = {
await installPlugin(certificate.meta.dns_provider);
logger.info(
`Requesting LetsEncrypt certificates via ${dnsPlugin.name} for Cert #${certificate.id}: ${certificate.domain_names.join(", ")}`,
`Requesting Certbot certificates via ${dnsPlugin.name} for Cert #${certificate.id}: ${certificate.domain_names.join(", ")}`,
);
const credentialsLocation = `/tmp/certbot-credentials/credentials-${certificate.id}`;
fs.writeFileSync(credentialsLocation, certificate.meta.dns_provider_credentials, { mode: 0o600 });
await writeFile(credentialsLocation, certificate.meta.dns_provider_credentials, { mode: 0o600 });
try {
const result = await utils.execFile("certbot", [
@@ -788,8 +694,7 @@ const internalCertificate = {
logger.info(result);
return result;
} catch (err) {
// Don't fail if file does not exist, so no need for action in the callback
fs.unlink(credentialsLocation, () => {});
await rm(credentialsLocation, { force: true });
throw err;
}
},
@@ -815,7 +720,7 @@ const internalCertificate = {
);
const updatedCertificate = await certificateModel.query().patchAndFetchById(certificate.id, {
expires_on: moment(certInfo.dates.to, "X").format("YYYY-MM-DD HH:mm:ss"),
expires_on: dayjs.unix(certInfo.dates.to).format("YYYY-MM-DD HH:mm:ss"),
});
// Add to audit log
@@ -884,7 +789,7 @@ const internalCertificate = {
}
logger.info(
`Renewing LetsEncrypt certificates via ${dnsPlugin.name} for Cert #${certificate.id}: ${certificate.domain_names.join(", ")}`,
`Renewing Certbot certificates via ${dnsPlugin.name} for Cert #${certificate.id}: ${certificate.domain_names.join(", ")}`,
);
try {
@@ -939,7 +844,7 @@ const internalCertificate = {
"unspecified",
"--delete-after-revoke",
]);
fs.rmSync(`/data/tls/certbot/live/npm-${certificate.id}.der`, { force: true });
await rm(`/data/tls/certbot/live/npm-${certificate.id}.der`, { force: true });
logger.info(result);
return result;
} catch (err) {
@@ -963,7 +868,7 @@ const internalCertificate = {
const testChallengeDir = "/data/tls/certbot/acme-challenge/.well-known/acme-challenge";
const testChallengeFile = `${testChallengeDir}/test-challenge`;
fs.mkdirSync(testChallengeDir, { recursive: true });
fs.writeFileSync(testChallengeFile, "Success", { encoding: "utf8" });
await writeFile(testChallengeFile, "Success", { encoding: "utf8" });
const results = [];
@@ -975,7 +880,7 @@ const internalCertificate = {
}
// Remove the test challenge file
fs.unlinkSync(testChallengeFile);
await rm(testChallengeFile, { force: true });
return results;
},

View File

@@ -1,4 +1,4 @@
import fs from "node:fs";
import { readFile, writeFile } from "node:fs/promises";
import { dirname } from "node:path";
import { fileURLToPath } from "node:url";
import utils from "../lib/utils.js";
@@ -82,20 +82,21 @@ const internalIpRanges = {
generateConfig: async (ip_ranges) => {
try {
const renderEngine = utils.getRenderEngine();
const template = fs.readFileSync(`${__dirname}/../templates/ip_ranges.conf`, { encoding: "utf8" });
const template = await readFile(`${__dirname}/../templates/ip_ranges.conf`, { encoding: "utf8" });
const newConfig = await renderEngine.parseAndRender(template, { ip_ranges: ip_ranges });
const filePath = "/usr/local/nginx/conf/conf.d/ip_ranges.conf";
if (fs.existsSync("/usr/local/nginx/conf/conf.d/ip_ranges.conf")) {
const oldConfig = fs.readFileSync("/usr/local/nginx/conf/conf.d/ip_ranges.conf", {
try {
const oldConfig = await readFile(filePath, {
encoding: "utf8",
});
if (oldConfig === newConfig) {
logger.info("Not updating Cloudflared IPs");
return false;
}
}
} catch {}
fs.writeFileSync("/usr/local/nginx/conf/conf.d/ip_ranges.conf", newConfig, { encoding: "utf8" });
await writeFile(filePath, newConfig, { encoding: "utf8" });
logger.info("Updated Cloudflared IPs");
return true;
} catch (err) {
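The hunk above only rewrites `ip_ranges.conf` when the rendered content actually changed. The same read-compare-write pattern in isolation (file path and names here are illustrative, not the project's):

```javascript
import { readFile, writeFile } from "node:fs/promises";

// Write the file only if its content differs; a missing file simply falls
// through to the write. Returns true when a write happened.
const writeIfChanged = async (filePath, newContent) => {
	try {
		const oldContent = await readFile(filePath, { encoding: "utf8" });
		if (oldContent === newContent) {
			return false; // unchanged, skip the write
		}
	} catch {} // file does not exist yet: create it below
	await writeFile(filePath, newContent, { encoding: "utf8" });
	return true;
};
```

This avoids touching the file's mtime (and triggering an nginx reload) when nothing changed.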

View File

@@ -1,4 +1,4 @@
import fs from "node:fs";
import { readFile, rename, rm, writeFile } from "node:fs/promises";
import { dirname } from "node:path";
import { domainToASCII, fileURLToPath } from "node:url";
import _ from "lodash";
@@ -24,111 +24,81 @@ const internalNginx = {
* @param {Object} host
* @returns {Promise}
*/
configure: (model, host_type, host) => {
configure: async (model, host_type, host) => {
let combined_meta = {};
return internalNginx
.test()
.then(() => {
return internalNginx.deleteConfig(host_type, host);
})
.then(() => {
return internalNginx.reload();
})
.then(() => {
return internalNginx.generateConfig(host_type, host);
})
.then(() => {
// Test nginx again and update meta with result
return internalNginx
.test()
.then(() => {
// nginx is ok
combined_meta = _.assign({}, host.meta, {
nginx_online: true,
nginx_err: null,
});
await internalNginx.deleteConfig(host_type, host);
await internalNginx.generateConfig(host_type, host);
return model.query().where("id", host.id).patch({
meta: combined_meta,
});
})
.catch((err) => {
logger.error(err.message);
// config is bad, update meta and rename config
combined_meta = _.assign({}, host.meta, {
nginx_online: false,
nginx_err: err.message,
});
return model
.query()
.where("id", host.id)
.patch({
meta: combined_meta,
})
.then(() => {
internalNginx.renameConfigAsError(host_type, host);
});
});
})
.then(() => {
return internalNginx.reload();
})
.then(() => {
return combined_meta;
try {
await internalNginx.test();
combined_meta = _.assign({}, host.meta, {
nginx_online: true,
nginx_err: null,
});
await model.query().where("id", host.id).patch({
meta: combined_meta,
});
} catch (err) {
logger.error(err.message);
// config is bad, update meta and rename config
combined_meta = _.assign({}, host.meta, {
nginx_online: false,
nginx_err: err.message,
});
await model.query().where("id", host.id).patch({
meta: combined_meta,
});
await internalNginx.renameConfigAsError(host_type, host);
}
await internalNginx.reload();
return combined_meta;
},
/**
* @returns {Promise}
*/
test: () => {
test: async () => {
return utils.execFile("nginx", ["-tq"]);
},
/**
* @returns {Promise}
*/
reload: () => {
const promises = [];
reload: async () => {
if (process.env.ACME_OCSP_STAPLING === "true") {
promises.push(
utils
.execFile("certbot-ocsp-fetcher.sh", [
"-c",
"/data/tls/certbot/live",
"-o",
"/data/tls/certbot/live",
"--no-reload-webserver",
"--quiet",
])
.catch(() => {}),
);
try {
await utils.execFile("certbot-ocsp-fetcher.sh", [
"-c",
"/data/tls/certbot/live",
"-o",
"/data/tls/certbot/live",
"--no-reload-webserver",
"--quiet",
]);
} catch {}
}
if (process.env.CUSTOM_OCSP_STAPLING === "true") {
promises.push(
utils
.execFile("certbot-ocsp-fetcher.sh", [
"-c",
"/data/tls/custom",
"-o",
"/data/tls/custom",
"--no-reload-webserver",
"--quiet",
])
.catch(() => {}),
);
try {
await utils.execFile("certbot-ocsp-fetcher.sh", [
"-c",
"/data/tls/custom",
"-o",
"/data/tls/custom",
"--no-reload-webserver",
"--quiet",
]);
} catch {}
}
return Promise.all(promises).finally(() => {
return internalNginx.test().then(() => {
return utils.execFile("nginx", ["-s", "reload"]);
});
});
await internalNginx.test();
return utils.execFile("nginx", ["-s", "reload"]);
},
/**
@@ -148,41 +118,40 @@ const internalNginx = {
* @param {Object} host
* @returns {Promise}
*/
renderLocations: (host) => {
return new Promise((resolve, reject) => {
let template;
renderLocations: async (host) => {
let template;
try {
template = fs.readFileSync(`${__dirname}/../templates/_proxy_host_custom_location.conf`, {
encoding: "utf8",
});
} catch (err) {
reject(new errs.ConfigurationError(err.message));
return;
try {
template = await readFile(`${__dirname}/../templates/_proxy_host_custom_location.conf`, {
encoding: "utf8",
});
} catch (err) {
throw new errs.ConfigurationError(err.message);
}
const renderEngine = utils.getRenderEngine();
let renderedLocations = "";
for (const location of host.locations) {
if (location.npmplus_enabled === false) {
continue;
}
const renderEngine = utils.getRenderEngine();
let renderedLocations = "";
if (
location.forward_host.indexOf("/") > -1 &&
!location.forward_host.startsWith("/") &&
!location.forward_host.startsWith("unix")
) {
const split = location.forward_host.split("/");
const locationRendering = async () => {
for (let i = 0; i < host.locations.length; i++) {
if (
host.locations[i].forward_host.indexOf("/") > -1 &&
!host.locations[i].forward_host.startsWith("/") &&
!host.locations[i].forward_host.startsWith("unix")
) {
const split = host.locations[i].forward_host.split("/");
location.forward_host = split.shift();
location.forward_path = `/${split.join("/")}`;
}
host.locations[i].forward_host = split.shift();
host.locations[i].forward_path = `/${split.join("/")}`;
}
renderedLocations += await renderEngine.parseAndRender(template, location);
}
renderedLocations += await renderEngine.parseAndRender(template, host.locations[i]);
}
};
locationRendering().then(() => resolve(renderedLocations));
});
return renderedLocations;
},
/**
@@ -190,117 +159,97 @@ const internalNginx = {
* @param {Object} host
* @returns {Promise}
*/
generateConfig: (host_type, host_row) => {
generateConfig: async (host_type, host_row) => {
// Prevent modifying the original object:
const host = JSON.parse(JSON.stringify(host_row));
const nice_host_type = internalNginx.getFileFriendlyHostType(host_type);
const renderEngine = utils.getRenderEngine();
return new Promise((resolve, reject) => {
let template = null;
const filename = internalNginx.getConfigName(nice_host_type, host.id);
let template = null;
const filename = internalNginx.getConfigName(nice_host_type, host.id);
try {
template = fs.readFileSync(`${__dirname}/../templates/${nice_host_type}.conf`, { encoding: "utf8" });
} catch (err) {
reject(new errs.ConfigurationError(err.message));
return;
}
let locationsPromise;
let origLocations;
// Manipulate the data a bit before sending it to the template
if (nice_host_type !== "default") {
host.use_default_location = true;
if (typeof host.advanced_config !== "undefined" && host.advanced_config) {
host.use_default_location = !internalNginx.advancedConfigHasDefaultLocation(host.advanced_config);
}
}
// For redirection hosts, if the scheme is not http or https, set it to $scheme
if (
nice_host_type === "redirection_host" &&
["http", "https"].indexOf(host.forward_scheme.toLowerCase()) === -1
) {
host.forward_scheme = "$scheme";
}
if (host.locations) {
//logger.info ('host.locations = ' + JSON.stringify(host.locations, null, 2));
origLocations = [].concat(host.locations);
locationsPromise = internalNginx.renderLocations(host).then((renderedLocations) => {
host.locations = renderedLocations;
});
// Allow someone who is using / custom location path to use it, and skip the default / location
_.map(host.locations, (location) => {
if (location.path === "/" && location.location_type !== "= ") {
host.use_default_location = false;
}
});
} else {
locationsPromise = Promise.resolve();
}
if (
host.forward_host &&
host.forward_host.indexOf("/") > -1 &&
!host.forward_host.startsWith("/") &&
!host.forward_host.startsWith("unix")
) {
const split = host.forward_host.split("/");
host.forward_host = split.shift();
host.forward_path = `/${split.join("/")}`;
}
if (host.domain_names) {
host.server_names = host.domain_names.map((domain_name) => domainToASCII(domain_name) || domain_name);
}
host.env = process.env;
locationsPromise.then(() => {
renderEngine
.parseAndRender(template, host)
.then((config_text) => {
fs.writeFileSync(filename, config_text, { encoding: "utf8" });
debug(logger, "Wrote config:", filename);
// Restore locations array
host.locations = origLocations;
resolve(true);
})
.catch((err) => {
debug(logger, `Could not write ${filename}:`, err.message);
reject(new errs.ConfigurationError(err.message));
})
.then(() => {
if (process.env.DISABLE_NGINX_BEAUTIFIER === "false") {
utils.execFile("nginxbeautifier", ["-s", "4", filename]).catch(() => {});
}
});
});
});
},
/**
* A simple wrapper around unlinkSync that writes to the logger
*
* @param {String} filename
*/
deleteFile: (filename) => {
if (!fs.existsSync(filename)) {
return;
}
try {
debug(logger, `Deleting file: ${filename}`);
fs.unlinkSync(filename);
template = await readFile(`${__dirname}/../templates/${nice_host_type}.conf`, { encoding: "utf8" });
} catch (err) {
debug(logger, "Could not delete file:", JSON.stringify(err, null, 2));
throw new errs.ConfigurationError(err.message);
}
let origLocations;
// Manipulate the data a bit before sending it to the template
if (nice_host_type !== "default") {
host.use_default_location = true;
if (typeof host.advanced_config !== "undefined" && host.advanced_config) {
host.use_default_location = !internalNginx.advancedConfigHasDefaultLocation(host.advanced_config);
}
}
// For redirection hosts, if the scheme is not http or https, set it to $scheme
if (
nice_host_type === "redirection_host" &&
["http", "https"].indexOf(host.forward_scheme.toLowerCase()) === -1
) {
host.forward_scheme = "$scheme";
}
if (host.locations) {
_.map(host.locations, (location) => {
if (location.path === "/" && location.location_type !== "= " && location.npmplus_enabled !== false) {
host.use_default_location = false;
}
if (location.npmplus_auth_request === "anubis") {
host.create_anubis_locations = true;
}
if (location.npmplus_auth_request === "tinyauth") {
host.create_tinyauth_locations = true;
}
if (location.npmplus_auth_request === "authelia") {
host.create_authelia_locations = true;
}
if (
location.npmplus_auth_request === "authentik" ||
location.npmplus_auth_request === "authentik-send-basic-auth"
) {
host.create_authentik_locations = true;
}
});
host.locations = await internalNginx.renderLocations(host);
}
if (
host.forward_host &&
host.forward_host.indexOf("/") > -1 &&
!host.forward_host.startsWith("/") &&
!host.forward_host.startsWith("unix")
) {
const split = host.forward_host.split("/");
host.forward_host = split.shift();
host.forward_path = `/${split.join("/")}`;
}
if (host.domain_names) {
host.server_names = host.domain_names.map((domain_name) => domainToASCII(domain_name) || domain_name);
}
host.env = process.env;
try {
const config_text = await renderEngine.parseAndRender(template, host);
await writeFile(filename, config_text, { encoding: "utf8" });
debug(logger, "Wrote config:", filename);
if (process.env.DISABLE_NGINX_BEAUTIFIER === "false") {
await utils.execFile("nginxbeautifier", ["-s", "4", filename]).catch(() => {});
}
return true;
} catch (err) {
debug(logger, `Could not write ${filename}:`, err.message);
throw new errs.ConfigurationError(err.message);
}
},
@@ -318,17 +267,22 @@ const internalNginx = {
* @param {Object} [host]
* @returns {Promise}
*/
deleteConfig: (host_type, host) => {
deleteConfig: async (host_type, host) => {
const config_file = internalNginx.getConfigName(
internalNginx.getFileFriendlyHostType(host_type),
typeof host === "undefined" ? 0 : host.id,
);
return new Promise((resolve /*, reject*/) => {
internalNginx.deleteFile(config_file);
internalNginx.deleteFile(`${config_file}.err`);
resolve();
});
const filesToDelete = [config_file, `${config_file}.err`];
for (const filename of filesToDelete) {
try {
debug(logger, `Deleting file: ${filename}`);
await rm(filename, { force: true });
} catch (err) {
debug(logger, "Could not delete file:", JSON.stringify(err, null, 2));
}
}
},
/**
@@ -336,17 +290,15 @@ const internalNginx = {
* @param {Object} [host]
* @returns {Promise}
*/
renameConfigAsError: (host_type, host) => {
renameConfigAsError: async (host_type, host) => {
const config_file = internalNginx.getConfigName(
internalNginx.getFileFriendlyHostType(host_type),
typeof host === "undefined" ? 0 : host.id,
);
return new Promise((resolve /*, reject */) => {
fs.rename(config_file, `${config_file}.err`, () => {
resolve();
});
});
try {
await rename(config_file, `${config_file}.err`);
} catch {}
},
/**
@@ -354,14 +306,15 @@ const internalNginx = {
* @param {Array} hosts
* @returns {Promise}
*/
bulkGenerateConfigs: (model, hostType, hosts) => {
const promises = [];
hosts.map((host) => {
promises.push(internalNginx.configure(model, hostType, host));
return true;
});
bulkGenerateConfigs: async (model, hostType, hosts) => {
const results = [];
return Promise.all(promises);
for (const host of hosts) {
const result = await internalNginx.configure(model, hostType, host);
results.push(result);
}
return results;
},
/**

View File

@@ -4,9 +4,12 @@ import { parseDatePeriod } from "../lib/helpers.js";
import authModel from "../models/auth.js";
import TokenModel from "../models/token.js";
import userModel from "../models/user.js";
import twoFactor from "./2fa.js";
const ERROR_MESSAGE_INVALID_AUTH = "Invalid email or password";
const ERROR_MESSAGE_INVALID_AUTH_I18N = "error.invalid-auth";
const ERROR_MESSAGE_INVALID_2FA = "Invalid verification code";
const ERROR_MESSAGE_INVALID_2FA_I18N = "error.invalid-2fa";
export default {
/**
@@ -52,7 +55,26 @@ export default {
throw new errs.AuthError(`Invalid scope: ${data.scope}`);
}
// Create a moment of the expiry expression
// Check if 2FA is enabled
const has2FA = await twoFactor.isEnabled(user.id);
if (has2FA) {
// Return challenge token instead of full token
const challengeToken = await Token.create({
iss: issuer || "api",
attrs: {
id: user.id,
},
scope: ["2fa-challenge"],
expiresIn: "5m",
});
return {
requires_2fa: true,
challenge_token: challengeToken.token,
};
}
// Create a dayjs of the expiry expression
const expiry = parseDatePeriod(data.expiry);
if (expiry === null) {
throw new errs.AuthError(`Invalid expiry time: ${data.expiry}`);
@@ -95,7 +117,7 @@ export default {
throw new errs.AuthError(ERROR_MESSAGE_INVALID_AUTH);
}
// Create a moment of the expiry expression
// Create a dayjs of the expiry expression
const expiry = parseDatePeriod(data.expiry);
if (expiry === null) {
throw new errs.AuthError(`Invalid expiry time: ${data.expiry}`);
@@ -130,7 +152,7 @@ export default {
thisData.expiry = thisData.expiry || "1d";
if (access?.token.getUserId(0)) {
// Create a moment of the expiry expression
// Create a dayjs of the expiry expression
const expiry = parseDatePeriod(thisData.expiry);
if (expiry === null) {
throw new errs.AuthError(`Invalid expiry time: ${thisData.expiry}`);
@@ -165,6 +187,62 @@ export default {
throw new errs.AssertionFailedError("Existing token contained invalid user data");
},
/**
* Verify 2FA code and return full token
* @param {string} challengeToken
* @param {string} code
* @param {string} [expiry]
* @returns {Promise}
*/
verify2FA: async (challengeToken, code, expiry) => {
const Token = TokenModel();
const tokenExpiry = expiry || "1d";
// Verify challenge token
let tokenData;
try {
tokenData = await Token.load(challengeToken);
} catch {
throw new errs.AuthError("Invalid or expired challenge token");
}
// Check scope
if (!tokenData.scope || tokenData.scope[0] !== "2fa-challenge") {
throw new errs.AuthError("Invalid challenge token");
}
const userId = tokenData.attrs?.id;
if (!userId) {
throw new errs.AuthError("Invalid challenge token");
}
// Verify 2FA code
const valid = await twoFactor.verifyForLogin(userId, code);
if (!valid) {
throw new errs.AuthError(ERROR_MESSAGE_INVALID_2FA, ERROR_MESSAGE_INVALID_2FA_I18N);
}
// Create full token
const expiryDate = parseDatePeriod(tokenExpiry);
if (expiryDate === null) {
throw new errs.AuthError(`Invalid expiry time: ${tokenExpiry}`);
}
const signed = await Token.create({
iss: "api",
attrs: {
id: userId,
},
scope: ["user"],
expiresIn: tokenExpiry,
});
return {
token: signed.token,
expires: expiryDate.toISOString(),
};
},
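The two-step login introduced above can be consumed like this hypothetical client-side sketch; the endpoint paths and the `api` object shape are assumptions for illustration only, not the project's actual routes:

```javascript
// Step 1: POST credentials. If the account has 2FA enabled, the server
// returns a short-lived challenge token instead of a full session token.
// Step 2: exchange challenge token + TOTP code for the full token.
const login = async (api, identity, secret, getCode) => {
	const first = await api.post("/tokens", { identity, secret });
	if (!first.requires_2fa) {
		return first.token; // no 2FA: full token immediately
	}
	const second = await api.post("/tokens/2fa", {
		challenge_token: first.challenge_token,
		code: await getCode(),
	});
	return second.token;
};
```

The challenge token carries only the `2fa-challenge` scope and expires after 5 minutes, so it cannot be replayed as a session token.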
/**
* @param {Object} user
* @returns {Promise}

View File

@@ -51,7 +51,12 @@ const internalUser = {
try {
const hash = crypto.createHash("sha256").update(data.email.trim().toLowerCase()).digest("hex");
const response = await fetch(
`https://www.gravatar.com/avatar/${hash}?s=64&default=initials&name=${encodeURIComponent(data.name)}`,
`https://www.gravatar.com/avatar/${hash}?s=64&default=initials&name=${encodeURIComponent(
data.name
.split(" ")
.map((n) => n[0])
.join(""),
)}`,
{
headers: {
"User-Agent": `NPMplus/${pjson.version}`,
@@ -175,7 +180,12 @@ const internalUser = {
.update((data.email || user.email).trim().toLowerCase())
.digest("hex");
const response = await fetch(
`https://www.gravatar.com/avatar/${hash}?s=64&default=initials&name=${encodeURIComponent(data.name || user.name)}`,
`https://www.gravatar.com/avatar/${hash}?s=64&default=initials&name=${encodeURIComponent(
(data.name || user.name)
.split(" ")
.map((n) => n[0])
.join(""),
)}`,
{
headers: {
"User-Agent": `NPMplus/${pjson.version}`,

View File

@@ -3,40 +3,13 @@ import crypto from "node:crypto";
import { global as logger } from "../logger.js";
const keysFile = "/data/npmplus/keys.json";
const sqliteEngine = "better-sqlite3";
const mysqlEngine = "mysql2";
const postgresEngine = "pg";
const sqliteClientName = "better-sqlite3";
let instance = null;
// 1. Load from config file first (not recommended anymore)
// 2. Use config env variables next
const configure = () => {
const filename = "/data/npmplus/default.json";
if (fs.existsSync(filename)) {
let configData;
try {
// Load this json synchronously
const rawData = fs.readFileSync(filename);
configData = JSON.parse(rawData);
} catch (_) {
// do nothing
}
if (configData?.database) {
logger.info(`Using configuration from file: ${filename}`);
// Migrate those who have "mysql" engine to "mysql2"
if (configData.database.engine === "mysql") {
configData.database.engine = mysqlEngine;
}
instance = configData;
instance.keys = getKeys();
return;
}
}
const toBool = (v) => /^(1|true|yes|on)$/i.test((v || "").trim());
const envMysqlHost = process.env.DB_MYSQL_HOST || null;
@@ -92,12 +65,13 @@ const configure = () => {
}
const envSqliteFile = "/data/npmplus/database.sqlite";
logger.info(`Using Sqlite: ${envSqliteFile}`);
instance = {
database: {
engine: "knex-native",
knex: {
client: sqliteClientName,
client: sqliteEngine,
connection: {
filename: envSqliteFile,
},
@@ -194,7 +168,7 @@ const configGet = (key) => {
*/
const isSqlite = () => {
instance === null && configure();
return instance.database.knex && instance.database.knex.client === sqliteClientName;
return instance.database.knex && instance.database.knex.client === sqliteEngine;
};
/**

View File

@@ -2,14 +2,25 @@ import Access from "../access.js";
export default () => {
return async (req, res, next) => {
const token = req.cookies?.token || null;
//if (!token) {
// return res.status(401).json({
// error: {
// message: "Missing token",
// },
// });
//}
try {
res.locals.access = null;
const access = new Access(req.cookies?.token || null);
const access = new Access(token);
await access.load();
res.locals.access = access;
next();
} catch {
return res.status(401).json({
res.clearCookie("token", { path: "/api" });
return res.status(403).json({
error: {
message: "Invalid or expired token",
},

View File

@@ -1,9 +1,9 @@
import moment from "moment";
import dayjs from "dayjs";
import { ref } from "objection";
import { isPostgres } from "./config.js";
/**
* Takes an expression such as 30d and returns a moment object of that date in future
* Takes an expression such as 30d and returns a dayjs object of that date in future
*
* Key Shorthand
* ==================
@@ -23,7 +23,7 @@ import { isPostgres } from "./config.js";
const parseDatePeriod = (expression) => {
const matches = expression.match(/^([0-9]+)(y|Q|M|w|d|h|m|s|ms)$/m);
if (matches) {
return moment().add(matches[1], matches[2]);
return dayjs().add(matches[1], matches[2]);
}
return null;
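For the fixed-length units, what `parseDatePeriod` computes can be sketched with the standard library alone (dayjs is still needed for the calendar-dependent units y, Q, M and w, which have variable lengths):

```javascript
// Stdlib-only sketch of the expression parsing above, limited to the
// fixed-length units d/h/m/s/ms. Returns a Date in the future, or null
// when the expression does not match.
const parseDatePeriodMs = (expression) => {
	const matches = expression.match(/^([0-9]+)(d|h|m|s|ms)$/m);
	if (!matches) return null;
	const unitMs = { d: 86400000, h: 3600000, m: 60000, s: 1000, ms: 1 };
	return new Date(Date.now() + Number(matches[1]) * unitMs[matches[2]]);
};
```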

View File

@@ -1,4 +1,5 @@
import { execFile as nodeExecFile } from "node:child_process";
import { promisify } from "node:util";
import { dirname } from "node:path";
import { fileURLToPath } from "node:url";
import { Liquid } from "liquidjs";
@@ -11,6 +12,8 @@ import errs from "./error.js";
const __filename = fileURLToPath(import.meta.url);
const __dirname = dirname(__filename);
const nodeExecFilePromises = promisify(nodeExecFile);
const writeHash = () => {
const envVars = fs.readdirSync(`${__dirname}/../templates`).flatMap((file) => {
const content = fs.readFileSync(`${__dirname}/../templates/${file}`, "utf8");
@@ -31,18 +34,18 @@ const writeHash = () => {
* @param {Array} args
* @returns {Promise}
*/
const execFile = (cmd, args) => {
const execFile = async (cmd, args) => {
debug(logger, `CMD: ${cmd} ${args ? args.join(" ") : ""}`);
return new Promise((resolve, reject) => {
nodeExecFile(cmd, args, (err, stdout, stderr) => {
if (err && typeof err === "object") {
reject(new errs.CommandError((stdout + stderr).trim(), 1, err));
} else {
resolve((stdout + stderr).trim());
}
});
});
try {
const { stdout, stderr } = await nodeExecFilePromises(cmd, args);
return `${stdout || ""}${stderr || ""}`.trim();
} catch (err) {
if (err && typeof err === "object") {
throw new errs.CommandError(`${err.stdout || ""}${err.stderr || ""}`.trim(), 1, err);
}
throw err;
}
};
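The promisified rewrite above can be exercised standalone; this is a sketch of the same pattern with a plain `Error` in place of the project's `CommandError`:

```javascript
import { execFile as nodeExecFile } from "node:child_process";
import { promisify } from "node:util";

const execFileAsync = promisify(nodeExecFile);

// Combine stdout and stderr into one trimmed string, on success and failure
// alike; promisify attaches stdout/stderr to the rejection error object.
const run = async (cmd, args) => {
	try {
		const { stdout, stderr } = await execFileAsync(cmd, args);
		return `${stdout || ""}${stderr || ""}`.trim();
	} catch (err) {
		throw new Error(`${err.stdout || ""}${err.stderr || ""}`.trim() || err.message);
	}
};
```

Unlike the callback form, a non-zero exit now surfaces as an ordinary rejected promise that `await` callers can catch.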
/**

View File

@@ -9,7 +9,7 @@ const migrate = new signale.Signale({ scope: "Migrate ", ...opts });
const express = new signale.Signale({ scope: "Express ", ...opts });
const access = new signale.Signale({ scope: "Access ", ...opts });
const nginx = new signale.Signale({ scope: "Nginx ", ...opts });
const ssl = new signale.Signale({ scope: "SSL ", ...opts });
const ssl = new signale.Signale({ scope: "TLS ", ...opts });
const certbot = new signale.Signale({ scope: "Certbot ", ...opts });
const importer = new signale.Signale({ scope: "Importer ", ...opts });
const setup = new signale.Signale({ scope: "Setup ", ...opts });

View File

@@ -0,0 +1,23 @@
/**
* Migrate
*
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}
*/
const up = (_knex) => {
return Promise.resolve(true);
};
/**
* Undo Migrate
*
* @param {Object} knex
* @returns {Promise}
*/
const down = (_knex) => {
return Promise.resolve(true);
};
export { up, down };

View File

@@ -5,7 +5,7 @@ const migrateName = "initial-schema";
/**
* Migrate
*
* @see http://knexjs.org/#Schema
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}

View File

@@ -5,7 +5,7 @@ const migrateName = "websockets";
/**
* Migrate
*
* @see http://knexjs.org/#Schema
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}

View File

@@ -5,7 +5,7 @@ const migrateName = "forward_host";
/**
* Migrate
*
* @see http://knexjs.org/#Schema
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}

View File

@@ -5,7 +5,7 @@ const migrateName = "http2_support";
/**
* Migrate
*
* @see http://knexjs.org/#Schema
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}

View File

@@ -5,7 +5,7 @@ const migrateName = "forward_scheme";
/**
* Migrate
*
* @see http://knexjs.org/#Schema
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}

View File

@@ -5,7 +5,7 @@ const migrateName = "disabled";
/**
* Migrate
*
* @see http://knexjs.org/#Schema
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}

View File

@@ -6,7 +6,7 @@ const migrateName = "custom_locations";
* Migrate
* Extends proxy_host table with locations field
*
* @see http://knexjs.org/#Schema
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}

View File

@@ -5,7 +5,7 @@ const migrateName = "hsts";
/**
* Migrate
*
* @see http://knexjs.org/#Schema
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}

View File

@@ -5,7 +5,7 @@ const migrateName = "settings";
/**
* Migrate
*
* @see http://knexjs.org/#Schema
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}

View File

@@ -5,7 +5,7 @@ const migrateName = "access_list_client";
/**
* Migrate
*
* @see http://knexjs.org/#Schema
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}

View File

@@ -5,7 +5,7 @@ const migrateName = "access_list_client_fix";
/**
* Migrate
*
* @see http://knexjs.org/#Schema
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}

View File

@@ -5,7 +5,7 @@ const migrateName = "pass_auth";
/**
* Migrate
*
* @see http://knexjs.org/#Schema
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}

View File

@@ -5,7 +5,7 @@ const migrateName = "redirection_scheme";
/**
* Migrate
*
* @see http://knexjs.org/#Schema
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}


@@ -5,7 +5,7 @@ const migrateName = "redirection_status_code";
/**
* Migrate
*
* @see http://knexjs.org/#Schema
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}


@@ -5,7 +5,7 @@ const migrateName = "stream_domain";
/**
* Migrate
*
* @see http://knexjs.org/#Schema
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}


@@ -26,7 +26,7 @@ async function regenerateDefaultHost(knex) {
/**
* Migrate
*
* @see http://knexjs.org/#Schema
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}


@@ -5,7 +5,7 @@ const migrateName = "stream_ssl";
/**
* Migrate
*
* @see http://knexjs.org/#Schema
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}


@@ -5,7 +5,7 @@ const migrateName = "change_incoming_port_to_string";
/**
* Migrate
*
* @see http://knexjs.org/#Schema
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}


@@ -26,7 +26,7 @@ async function regenerateDefaultHost(knex) {
/**
* Migrate
*
* @see http://knexjs.org/#Schema
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}


@@ -5,7 +5,7 @@ const migrateName = "change_forwarding_port_to_string";
/**
* Migrate
*
* @see http://knexjs.org/#Schema
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}


@@ -5,7 +5,7 @@ const migrateName = "allow_empty_forwarding_port";
/**
* Migrate
*
* @see http://knexjs.org/#Schema
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}


@@ -5,7 +5,7 @@ const migrateName = "allow_empty_stream_forwarding_port";
/**
* Migrate
*
* @see http://knexjs.org/#Schema
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}


@@ -1,11 +1,11 @@
import { migrate as logger } from "../logger.js";
const migrateName = "20250627140440_stream_proxy_protocol_forwarding";
const migrateName = "stream_proxy_protocol_forwarding";
/**
* Migrate
*
* @see http://knexjs.org/#Schema
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}


@@ -5,7 +5,7 @@ const migrateName = "redirect_auto_scheme";
/**
* Migrate
*
* @see http://knexjs.org/#Schema
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}


@@ -0,0 +1,36 @@
import { migrate as logger } from "../logger.js";
const migrateName = "stream_proxy_ssl";
/**
* Migrate
*
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}
*/
const up = (knex) => {
logger.info(`[${migrateName}] Migrating Up...`);
return knex.schema
.table("stream", (stream) => {
stream.integer("proxy_ssl").notNull().unsigned().defaultTo(0);
})
.then(() => {
logger.info(`[${migrateName}] stream Table altered`);
});
};
/**
* Undo Migrate
*
* @param {Object} knex
* @returns {Promise}
*/
const down = (_knex) => {
logger.warn(`[${migrateName}] You can't migrate down this one.`);
return Promise.resolve(true);
};
export { up, down };
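Each migration file in this changeset follows the same contract: named `up`/`down` exports that each return a Promise. A minimal sketch of that contract and a toy runner (the `runMigrations` helper is illustrative, not part of the repo; knex has its own migrator):

```javascript
// Sketch of the up/down contract shared by the migrations above:
// each module exports `up` and `down`, both returning a Promise.
const migration = {
  up: (_knex) => Promise.resolve(true),
  down: (_knex) => Promise.resolve(true), // often a "can't migrate down" no-op in this repo
};

// Tiny illustrative runner that awaits each migration's up() in order.
async function runMigrations(migrations, knex) {
  let applied = 0;
  for (const m of migrations) {
    await m.up(knex);
    applied += 1;
  }
  return applied;
}

runMigrations([migration], {}).then((n) => console.log(n)); // 1
```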


@@ -0,0 +1,37 @@
import { migrate as logger } from "../logger.js";
const migrateName = "stream_rename_pp_and_tls";
/**
* Migrate
*
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}
*/
const up = (knex) => {
logger.info(`[${migrateName}] Migrating Up...`);
return knex.schema
.table("stream", (stream) => {
stream.renameColumn("proxy_protocol_forwarding", "npmplus_proxy_protocol_forwarding");
stream.renameColumn("proxy_ssl", "npmplus_proxy_tls");
})
.then(() => {
logger.info(`[${migrateName}] stream Table altered`);
});
};
/**
* Undo Migrate
*
* @param {Object} knex
* @returns {Promise}
*/
const down = (_knex) => {
logger.warn(`[${migrateName}] You can't migrate down this one.`);
return Promise.resolve(true);
};
export { up, down };


@@ -0,0 +1,43 @@
import { migrate as logger } from "../logger.js";
const migrateName = "trust_forwarded_proto";
/**
* Migrate
*
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}
*/
const up = (knex) => {
logger.info(`[${migrateName}] Migrating Up...`);
return knex.schema
.alterTable("proxy_host", (table) => {
table.tinyint("trust_forwarded_proto").notNullable().defaultTo(0);
})
.then(() => {
logger.info(`[${migrateName}] proxy_host Table altered`);
});
};
/**
* Undo Migrate
*
* @param {Object} knex
* @returns {Promise}
*/
const down = (knex) => {
logger.info(`[${migrateName}] Migrating Down...`);
return knex.schema
.alterTable("proxy_host", (table) => {
table.dropColumn("trust_forwarded_proto");
})
.then(() => {
logger.info(`[${migrateName}] proxy_host Table altered`);
});
};
export { up, down };


@@ -0,0 +1,36 @@
import { migrate as logger } from "../logger.js";
const migrateName = "reset_button_values";
/**
* Migrate
*
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}
*/
const up = async (knex) => {
logger.info(`[${migrateName}] Migrating Up...`);
await knex("proxy_host").update({
caching_enabled: 0,
block_exploits: 0,
allow_websocket_upgrade: 0,
});
logger.info(`[${migrateName}] proxy_host values reset`);
};
/**
* Undo Migrate
*
* @param {Object} knex
* @returns {Promise}
*/
const down = (_knex) => {
logger.warn(`[${migrateName}] You can't migrate down this one.`);
return Promise.resolve(true);
};
export { up, down };


@@ -0,0 +1,23 @@
/**
* Migrate
*
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}
*/
const up = (_knex) => {
return Promise.resolve(true);
};
/**
* Undo Migrate
*
* @param {Object} knex
* @returns {Promise}
*/
const down = (_knex) => {
return Promise.resolve(true);
};
export { up, down };


@@ -0,0 +1,57 @@
import { migrate as logger } from "../logger.js";
const migrateName = "new_proxy_buttons";
/**
* Migrate
*
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}
*/
const up = async (knex) => {
const hasRequestBuffering = await knex.schema.hasColumn("proxy_host", "npmplus_proxy_request_buffering");
const hasResponseBuffering = await knex.schema.hasColumn("proxy_host", "npmplus_proxy_response_buffering");
const hasFancyindexUpstreamCompression = await knex.schema.hasColumn(
"proxy_host",
"npmplus_fancyindex_upstream_compression",
);
const hasNoindex = await knex.schema.hasColumn("proxy_host", "npmplus_noindex");
if (hasRequestBuffering && hasResponseBuffering && hasFancyindexUpstreamCompression && hasNoindex) {
return;
}
logger.info(`[${migrateName}] Migrating Up...`);
await knex.schema.table("proxy_host", (proxy_host) => {
if (!hasRequestBuffering) {
proxy_host.integer("npmplus_proxy_request_buffering").notNull().unsigned().defaultTo(0);
}
if (!hasResponseBuffering) {
proxy_host.integer("npmplus_proxy_response_buffering").notNull().unsigned().defaultTo(0);
}
if (!hasFancyindexUpstreamCompression) {
proxy_host.integer("npmplus_fancyindex_upstream_compression").notNull().unsigned().defaultTo(0);
}
if (!hasNoindex) {
proxy_host.integer("npmplus_noindex").notNull().unsigned().defaultTo(0);
}
});
logger.info(`[${migrateName}] proxy_host Table altered`);
};
/**
* Undo Migrate
*
* @param {Object} knex
* @returns {Promise}
*/
const down = (_knex) => {
logger.warn(`[${migrateName}] You can't migrate down this one.`);
return Promise.resolve(true);
};
export { up, down };
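The migration above checks `hasColumn` for every field before adding it, so re-running is safe. The idempotency logic can be sketched without knex (the `columnsToAdd` helper name is an assumption for illustration):

```javascript
// Given the columns a table already has and the columns a migration wants,
// return only the missing ones -- mirroring the hasColumn guards above.
function columnsToAdd(existingColumns, desiredColumns) {
  const existing = new Set(existingColumns);
  return desiredColumns.filter((col) => !existing.has(col));
}

const desired = [
  "npmplus_proxy_request_buffering",
  "npmplus_proxy_response_buffering",
  "npmplus_fancyindex_upstream_compression",
  "npmplus_noindex",
];

// First run: nothing exists yet, so every column is added.
console.log(columnsToAdd([], desired).length); // 4

// Re-run: all columns already present, so the migration becomes a no-op.
console.log(columnsToAdd(desired, desired).length); // 0
```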


@@ -0,0 +1,37 @@
import { migrate as logger } from "../logger.js";
const migrateName = "new_proxy_selections";
/**
* Migrate
*
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}
*/
const up = (knex) => {
logger.info(`[${migrateName}] Migrating Up...`);
return knex.schema
.table("proxy_host", (proxy_host) => {
proxy_host.string("npmplus_x_frame_options").notNull().defaultTo("SAMEORIGIN");
proxy_host.string("npmplus_auth_request").notNull().defaultTo("none");
})
.then(() => {
logger.info(`[${migrateName}] proxy_host Table altered`);
});
};
/**
* Undo Migrate
*
* @param {Object} knex
* @returns {Promise}
*/
const down = (_knex) => {
logger.warn(`[${migrateName}] You can't migrate down this one.`);
return Promise.resolve(true);
};
export { up, down };


@@ -0,0 +1,57 @@
import { migrate as logger } from "../logger.js";
const migrateName = "npmplus_http3_support";
/**
* Migrate
*
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}
*/
const up = async (knex) => {
const proxyHostHasHttp3Column = await knex.schema.hasColumn("proxy_host", "npmplus_http3_support");
const redirectHostHasHttp3Column = await knex.schema.hasColumn("redirection_host", "npmplus_http3_support");
const deadHostHasHttp3Column = await knex.schema.hasColumn("dead_host", "npmplus_http3_support");
if (proxyHostHasHttp3Column && redirectHostHasHttp3Column && deadHostHasHttp3Column) {
return;
}
logger.info(`[${migrateName}] Migrating Up...`);
if (!proxyHostHasHttp3Column) {
await knex.schema.table("proxy_host", (proxy_host) => {
proxy_host.integer("npmplus_http3_support").notNull().unsigned().defaultTo(0);
});
logger.info(`[${migrateName}] proxy_host Table altered`);
}
if (!redirectHostHasHttp3Column) {
await knex.schema.table("redirection_host", (redirection_host) => {
redirection_host.integer("npmplus_http3_support").notNull().unsigned().defaultTo(0);
});
logger.info(`[${migrateName}] redirection_host Table altered`);
}
if (!deadHostHasHttp3Column) {
await knex.schema.table("dead_host", (dead_host) => {
dead_host.integer("npmplus_http3_support").notNull().unsigned().defaultTo(0);
});
logger.info(`[${migrateName}] dead_host Table altered`);
}
};
/**
* Undo Migrate
*
* @param {Object} knex
* @returns {Promise}
*/
const down = (_knex) => {
logger.warn(`[${migrateName}] You can't migrate down this one.`);
return Promise.resolve(true);
};
export { up, down };


@@ -0,0 +1,23 @@
/**
* Migrate
*
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}
*/
const up = (_knex) => {
return Promise.resolve(true);
};
/**
* Undo Migrate
*
* @param {Object} knex
* @returns {Promise}
*/
const down = (_knex) => {
return Promise.resolve(true);
};
export { up, down };


@@ -0,0 +1,57 @@
import { migrate as logger } from "../logger.js";
const migrateName = "new_and_split_proxy_buttons";
/**
* Migrate
*
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}
*/
const up = async (knex) => {
const hasCrowdsecAppsec = await knex.schema.hasColumn("proxy_host", "npmplus_crowdsec_appsec");
const hasUpstreamCompression = await knex.schema.hasColumn("proxy_host", "npmplus_upstream_compression");
const hasFancyindex = await knex.schema.hasColumn("proxy_host", "npmplus_fancyindex");
const hasFancyindexUpstreamCompression = await knex.schema.hasColumn(
"proxy_host",
"npmplus_fancyindex_upstream_compression",
);
if (hasCrowdsecAppsec && hasUpstreamCompression && hasFancyindex && !hasFancyindexUpstreamCompression) {
return;
}
logger.info(`[${migrateName}] Migrating Up...`);
await knex.schema.table("proxy_host", (proxy_host) => {
if (!hasCrowdsecAppsec) {
proxy_host.integer("npmplus_crowdsec_appsec").notNull().unsigned().defaultTo(0);
}
if (!hasUpstreamCompression) {
proxy_host.integer("npmplus_upstream_compression").notNull().unsigned().defaultTo(0);
}
if (!hasFancyindex) {
proxy_host.integer("npmplus_fancyindex").notNull().unsigned().defaultTo(0);
}
if (hasFancyindexUpstreamCompression) {
proxy_host.dropColumn("npmplus_fancyindex_upstream_compression");
}
});
logger.info(`[${migrateName}] proxy_host Table altered`);
};
/**
* Undo Migrate
*
* @param {Object} knex
* @returns {Promise}
*/
const down = (_knex) => {
logger.warn(`[${migrateName}] You can't migrate down this one.`);
return Promise.resolve(true);
};
export { up, down };


@@ -0,0 +1,23 @@
/**
* Migrate
*
* @see https://knexjs.org/guide/migrations.html#migration-api
*
* @param {Object} knex
* @returns {Promise}
*/
const up = (_knex) => {
return Promise.resolve(true);
};
/**
* Undo Migrate
*
* @param {Object} knex
* @returns {Promise}
*/
const down = (_knex) => {
return Promise.resolve(true);
};
export { up, down };


@@ -10,7 +10,15 @@ import User from "./user.js";
Model.knex(db());
const boolFields = ["is_deleted", "ssl_forced", "http2_support", "enabled", "hsts_enabled", "hsts_subdomains"];
const boolFields = [
"is_deleted",
"ssl_forced",
"http2_support",
"npmplus_http3_support",
"enabled",
"hsts_enabled",
"hsts_subdomains",
];
class DeadHost extends Model {
$beforeInsert() {


@@ -18,9 +18,17 @@ const boolFields = [
"block_exploits",
"allow_websocket_upgrade",
"http2_support",
"npmplus_http3_support",
"enabled",
"hsts_enabled",
"hsts_subdomains",
"trust_forwarded_proto",
"npmplus_noindex",
"npmplus_crowdsec_appsec",
"npmplus_proxy_request_buffering",
"npmplus_proxy_response_buffering",
"npmplus_upstream_compression",
"npmplus_fancyindex",
];
class ProxyHost extends Model {


@@ -19,6 +19,7 @@ const boolFields = [
"hsts_enabled",
"hsts_subdomains",
"http2_support",
"npmplus_http3_support",
];
class RedirectionHost extends Model {


@@ -7,7 +7,14 @@ import User from "./user.js";
Model.knex(db());
const boolFields = ["is_deleted", "enabled", "tcp_forwarding", "udp_forwarding", "proxy_protocol_forwarding"];
const boolFields = [
"is_deleted",
"enabled",
"tcp_forwarding",
"udp_forwarding",
"npmplus_proxy_protocol_forwarding",
"npmplus_proxy_tls",
];
class Stream extends Model {
$beforeInsert() {

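The `boolFields` arrays in the model diffs above list columns stored as 0/1 integers but exposed as booleans. A hedged sketch of that conversion (illustrative function names; the repo presumably does this through its Objection model hooks):

```javascript
const boolFields = [
  "is_deleted",
  "enabled",
  "tcp_forwarding",
  "udp_forwarding",
  "npmplus_proxy_protocol_forwarding",
  "npmplus_proxy_tls",
];

// Database row -> API object: 0/1 integers become booleans.
function parseDatabaseRow(row) {
  const out = { ...row };
  for (const field of boolFields) {
    if (field in out) out[field] = Boolean(out[field]);
  }
  return out;
}

// API object -> database row: booleans become 0/1 integers.
function formatDatabaseRow(obj) {
  const out = { ...obj };
  for (const field of boolFields) {
    if (field in out) out[field] = out[field] ? 1 : 0;
  }
  return out;
}

const row = { id: 1, enabled: 1, npmplus_proxy_tls: 0 };
console.log(parseDatabaseRow(row).enabled); // true
console.log(formatDatabaseRow(parseDatabaseRow(row)).npmplus_proxy_tls); // 0
```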

@@ -3,32 +3,34 @@
"version": "0.0.0",
"description": "A beautiful interface for creating Nginx endpoints",
"author": "Jamie Curnow <jc@jc21.com> and ZoeyVid <zoeyvid@zvcdn.de>",
"license": "MIT",
"license": "AGPL-3.0 (and MIT)",
"main": "index.js",
"type": "module",
"dependencies": {
"@apidevtools/json-schema-ref-parser": "15.1.3",
"ajv": "8.17.1",
"@apidevtools/json-schema-ref-parser": "15.3.1",
"ajv": "8.18.0",
"archiver": "7.0.1",
"bcryptjs": "3.0.3",
"better-sqlite3": "12.5.0",
"better-sqlite3": "12.6.2",
"cookie-parser": "1.4.7",
"dayjs": "1.11.19",
"express": "5.2.1",
"express-fileupload": "1.5.2",
"express-rate-limit": "8.2.1",
"jsonwebtoken": "9.0.3",
"knex": "3.1.0",
"liquidjs": "10.24.0",
"lodash": "4.17.21",
"moment": "2.30.1",
"mysql2": "3.16.0",
"lodash": "4.17.23",
"mysql2": "3.18.2",
"objection": "3.1.5",
"openid-client": "6.8.1",
"pg": "8.16.3",
"openid-client": "6.8.2",
"otplib": "13.3.0",
"pg": "8.19.0",
"signale": "1.4.0",
"temp-write": "6.0.0"
"swagger-ui-express": "5.0.1"
},
"devDependencies": {
"@apidevtools/swagger-parser": "12.1.0",
"@biomejs/biome": "2.3.11"
"@biomejs/biome": "2.4.5"
}
}


@@ -2,7 +2,7 @@
// based on: https://github.com/jlesage/docker-nginx-proxy-manager/blob/796734a3f9a87e0b1561b47fd418f82216359634/rootfs/opt/nginx-proxy-manager/bin/reset-password
import fs from "node:fs";
import { existsSync } from "node:fs";
import bcrypt from "bcryptjs";
import Database from "better-sqlite3";
@@ -13,7 +13,7 @@ Reset password of a NPMplus user.
Arguments:
USER_EMAIL Email address of the user to reset the password.
PASSWORD Optional new password of the user. If not set, password is set to 'changeme'.`);
PASSWORD New password of the user.`);
process.exit(1);
}
@@ -21,57 +21,35 @@ const args = process.argv.slice(2);
const USER_EMAIL = args[0];
const PASSWORD = args[1];
if (!USER_EMAIL && !PASSWORD) {
console.error("ERROR: User email address must be set.");
console.error("ERROR: Password must be set.");
if (!USER_EMAIL || !PASSWORD) {
if (!USER_EMAIL) console.error("ERROR: User email address must be set.");
if (!PASSWORD) console.error("ERROR: Password must be set.");
usage();
}
if (!USER_EMAIL) {
console.error("ERROR: User email address must be set.");
usage();
}
if (!PASSWORD) {
console.error("ERROR: Password must be set.");
usage();
}
if (fs.existsSync("/data/npmplus/database.sqlite")) {
bcrypt.hash(PASSWORD, 13, (err, PASSWORD_HASH) => {
if (err) {
console.error(err);
process.exit(1);
}
const db = new Database("/data/npmplus/database.sqlite");
try {
const stmt = db.prepare(`
UPDATE auth
SET secret = ?
WHERE EXISTS (
SELECT *
FROM user
WHERE user.id = auth.user_id AND user.email = ?
)`);
const result = stmt.run(PASSWORD_HASH, USER_EMAIL);
if (result.changes > 0) {
console.log(`Password for user ${USER_EMAIL} has been reset.`);
} else {
console.log(`No user found with email ${USER_EMAIL}.`);
}
} catch (error) {
console.error(error);
process.exit(1);
} finally {
db.close();
}
process.exit(0);
});
} else {
if (!existsSync("/data/npmplus/database.sqlite")) {
console.error("ERROR: Cannot connect to the sqlite database.");
process.exit(1);
}
let db;
try {
db = new Database("/data/npmplus/database.sqlite");
const PASSWORD_HASH = bcrypt.hashSync(PASSWORD, 13);
const stmt = db.prepare(
"UPDATE auth SET secret = ? WHERE EXISTS (SELECT * FROM user WHERE user.id = auth.user_id AND user.email = ?)",
);
const result = stmt.run(PASSWORD_HASH, USER_EMAIL);
if (result.changes > 0) {
console.log(`Password for user ${USER_EMAIL} has been reset.`);
} else {
console.log(`No user found with email ${USER_EMAIL}.`);
}
} catch (error) {
console.error(error);
process.exitCode = 1;
} finally {
if (db) db.close();
}
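The rewritten reset script hashes the new password (bcrypt, cost 13) before updating the `auth` row. For a dependency-free sketch of the same hash-then-verify flow, `node:crypto`'s scrypt can stand in for `bcryptjs` (the algorithm swap is purely for illustration; the real script uses bcrypt):

```javascript
import { scryptSync, randomBytes, timingSafeEqual } from "node:crypto";

// Hash a password into "salt:hash" (hex). scrypt stands in for bcrypt here.
function hashPassword(password) {
  const salt = randomBytes(16);
  const hash = scryptSync(password, salt, 32);
  return `${salt.toString("hex")}:${hash.toString("hex")}`;
}

// Verify a password against a stored "salt:hash" string, in constant time.
function verifyPassword(password, stored) {
  const [saltHex, hashHex] = stored.split(":");
  const hash = scryptSync(password, Buffer.from(saltHex, "hex"), 32);
  return timingSafeEqual(hash, Buffer.from(hashHex, "hex"));
}

const stored = hashPassword("changeme");
console.log(verifyPassword("changeme", stored)); // true
console.log(verifyPassword("wrong", stored)); // false
```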

backend/pnpm-lock.yaml (generated)

File diff suppressed because it is too large

backend/routes/docs.js (new file)

@@ -0,0 +1,36 @@
import express from "express";
import swaggerUi from "swagger-ui-express";
import { debug, express as logger } from "../logger.js";
import PACKAGE from "../package.json" with { type: "json" };
import { getCompiledSchema } from "../schema/index.js";
const router = express.Router({
caseSensitive: true,
strict: true,
mergeParams: true,
});
router.use("/", swaggerUi.serve);
router
.route("/")
.options((_, res) => {
res.sendStatus(204);
})
/**
* GET / (Now serves the Swagger UI interface)
*/
.get(async (req, res, next) => {
try {
const swaggerJSON = await getCompiledSchema();
swaggerJSON.info.version = PACKAGE.version;
swaggerJSON.servers[0].url = `${req.protocol}://${req.get("host")}/api`;
res.status(200).send(swaggerUi.generateHTML(swaggerJSON));
} catch (err) {
debug(logger, `${req.method.toUpperCase()} ${req.originalUrl}: ${err}`);
next(err);
}
});
export default router;
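Before rendering, the route above stamps the running package version into the compiled schema and points `servers[0]` at the requesting host. That patch step is pure data transformation and can be sketched on its own (`patchSwaggerJSON` is an illustrative name):

```javascript
// Mirror of the two assignments in the docs route handler above:
// stamp the running version and point servers[0] at the requesting host.
function patchSwaggerJSON(swaggerJSON, version, protocol, host) {
  swaggerJSON.info.version = version;
  swaggerJSON.servers[0].url = `${protocol}://${host}/api`;
  return swaggerJSON;
}

const schema = { info: { version: "0.0.0" }, servers: [{ url: "" }] };
const patched = patchSwaggerJSON(schema, "1.2.3", "https", "npm.example.com");
console.log(patched.info.version); // 1.2.3
console.log(patched.servers[0].url); // https://npm.example.com/api
```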


@@ -10,6 +10,7 @@ import proxyHostsRoutes from "./nginx/proxy_hosts.js";
import redirectionHostsRoutes from "./nginx/redirection_hosts.js";
import streamsRoutes from "./nginx/streams.js";
import reportsRoutes from "./reports.js";
import docsRoutes from "./docs.js";
import schemaRoutes from "./schema.js";
import settingsRoutes from "./settings.js";
import tokensRoutes from "./tokens.js";
@@ -44,6 +45,7 @@ router.get(["/api", "/api/"], async (_, res /*, next*/) => {
});
});
router.use("/api/docs", docsRoutes);
router.use("/api/schema", schemaRoutes);
router.use("/api/tokens", tokensRoutes);
if (isOIDCenabled) router.use("/api/oidc", oidcRoutes);


@@ -155,8 +155,8 @@ router
* Validate certificates
*/
.post(async (req, res, next) => {
if (!req.files) {
res.status(400).send({ error: "No files were uploaded" });
if (!req.files || Object.keys(req.files).length !== 2 || !req.files.certificate || !req.files.certificate_key) {
res.status(400).send({ error: "certificate and certificate_key were not uploaded" });
return;
}
@@ -254,8 +254,8 @@ router
* Upload certificates
*/
.post(async (req, res, next) => {
if (!req.files) {
res.status(400).send({ error: "No files were uploaded" });
if (!req.files || Object.keys(req.files).length !== 2 || !req.files.certificate || !req.files.certificate_key) {
res.status(400).send({ error: "certificate and certificate_key were not uploaded" });
return;
}


@@ -1,5 +1,6 @@
import * as client from "openid-client";
import express from "express";
import { rateLimit } from "express-rate-limit";
import errs from "../lib/error.js";
import internalToken from "../internal/token.js";
import { oidc as logger } from "../logger.js";
@@ -10,6 +11,18 @@ const router = express.Router({
mergeParams: true,
});
const limiter = rateLimit({
windowMs: 10 * 60 * 1000,
limit: 10,
standardHeaders: "draft-8",
legacyHeaders: false,
ipv6Subnet: 64,
skipSuccessfulRequests: true,
validate: { trustProxy: false },
});
router.use(limiter);
router
.route("/")
.options((_, res) => {
@@ -39,28 +52,30 @@ router
code_challenge: await client.calculatePKCECodeChallenge(code_verifier),
};
res.cookie("npmplus_oidc_no_redirect", "true", { secure: true, sameSite: "Strict" });
res.cookie("npmplus_oidc_code_verifier", code_verifier, {
httpOnly: true,
secure: true,
sameSite: "lax",
sameSite: "Lax",
path: "/api/oidc",
});
res.cookie("npmplus_oidc_state", parameters.state, {
httpOnly: true,
secure: true,
sameSite: "lax",
sameSite: "Lax",
path: "/api/oidc",
});
res.cookie("npmplus_oidc_nonce", parameters.nonce, {
httpOnly: true,
secure: true,
sameSite: "lax",
sameSite: "Lax",
path: "/api/oidc",
});
res.redirect(await client.buildAuthorizationUrl(config, parameters).toString());
} catch (err) {
logger.error(`Callback error: ${err.message}`);
res.cookie("npmplus_oidc_no_redirect", "true", { secure: true, sameSite: "Strict" });
res.clearCookie("npmplus_oidc_state", { path: "/api/oidc" });
res.clearCookie("npmplus_oidc_nonce", { path: "/api/oidc" });
res.clearCookie("npmplus_oidc_code_verifier", { path: "/api/oidc" });
@@ -113,11 +128,12 @@ router
res.cookie("token", data.token, {
httpOnly: true,
secure: true,
sameSite: "lax",
sameSite: "Strict",
path: "/api",
expires: new Date(data.expires),
});
res.clearCookie("npmplus_oidc_no_redirect");
res.clearCookie("npmplus_oidc_state", { path: "/api/oidc" });
res.clearCookie("npmplus_oidc_nonce", { path: "/api/oidc" });
res.clearCookie("npmplus_oidc_code_verifier", { path: "/api/oidc" });
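The handlers above consistently issue the session cookie as `httpOnly`, `secure`, `SameSite=Strict`, scoped to `/api`. Serializing those options into a `Set-Cookie` header value looks roughly like this (illustrative helper; Express does this internally via the `cookie` package):

```javascript
// Build a Set-Cookie header value from the options used in the routes above.
function serializeCookie(name, value, opts) {
  const parts = [`${name}=${encodeURIComponent(value)}`];
  if (opts.path) parts.push(`Path=${opts.path}`);
  if (opts.expires) parts.push(`Expires=${opts.expires.toUTCString()}`);
  if (opts.httpOnly) parts.push("HttpOnly");
  if (opts.secure) parts.push("Secure");
  if (opts.sameSite) parts.push(`SameSite=${opts.sameSite}`);
  return parts.join("; ");
}

const header = serializeCookie("token", "abc123", {
  httpOnly: true,
  secure: true,
  sameSite: "Strict",
  path: "/api",
});
console.log(header); // token=abc123; Path=/api; HttpOnly; Secure; SameSite=Strict
```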


@@ -18,22 +18,11 @@ router
/**
* GET /schema
*/
.get(async (req, res) => {
.get(async (req, res, next) => {
try {
const swaggerJSON = await getCompiledSchema();
let proto = req.protocol;
if (typeof req.headers["x-forwarded-proto"] !== "undefined" && req.headers["x-forwarded-proto"]) {
proto = req.headers["x-forwarded-proto"];
}
let origin = `${proto}://${req.hostname}`;
if (typeof req.headers.origin !== "undefined" && req.headers.origin) {
origin = req.headers.origin;
}
swaggerJSON.info.version = PACKAGE.version;
swaggerJSON.servers[0].url = `${origin}/api`;
swaggerJSON.servers[0].url = `${req.protocol}://${req.get("host")}/api`;
res.status(200).send(swaggerJSON);
} catch (err) {
debug(logger, `${req.method.toUpperCase()} ${req.originalUrl}: ${err}`);


@@ -1,4 +1,5 @@
import express from "express";
import { rateLimit } from "express-rate-limit";
import internalToken from "../internal/token.js";
import errs from "../lib/error.js";
import jwtdecode from "../lib/express/jwt-decode.js";
@@ -12,6 +13,19 @@ const router = express.Router({
mergeParams: true,
});
const limiter = rateLimit({
windowMs: 5 * 60 * 1000,
limit: 10,
message: { error: { message: "Too many requests, please try again later." } },
standardHeaders: "draft-8",
legacyHeaders: false,
ipv6Subnet: 64,
skipSuccessfulRequests: true,
validate: { trustProxy: false },
});
router.use(limiter);
router
.route("/")
.options((_, res) => {
@@ -26,6 +40,12 @@ router
* for services like Job board and Worker.
*/
.get(jwtdecode(), async (req, res, next) => {
if (!req.cookies?.token) {
res.clearCookie("token", { path: "/api" });
res.cookie("npmplus_oidc_no_redirect", "true", { secure: true, sameSite: "Strict" });
return res.status(401).send({ expires: new Date(0).toISOString() });
}
try {
const data = await internalToken.getFreshToken(res.locals.access, {
expiry: typeof req.query.expiry !== "undefined" ? req.query.expiry : null,
@@ -35,7 +55,7 @@ router
res.cookie("token", data.token, {
httpOnly: true,
secure: true,
sameSite: "lax",
sameSite: "Strict",
path: "/api",
expires: new Date(data.expires),
});
@@ -60,16 +80,19 @@ router
const data = await apiValidator(getValidationSchema("/tokens", "post"), req.body);
const result = await internalToken.getTokenFromEmail(data);
const { token, ...responseBody } = result;
res.cookie("token", result.token, {
httpOnly: true,
secure: true,
sameSite: "lax",
path: "/api",
expires: new Date(result.expires),
});
if (result.token && result.expires) {
res.cookie("token", result.token, {
httpOnly: true,
secure: true,
sameSite: "Strict",
path: "/api",
expires: new Date(result.expires),
});
}
res.status(200).send({ expires: result.expires });
res.status(200).send(responseBody);
} catch (err) {
debug(logger, `${req.method.toUpperCase()} ${req.originalUrl}: ${err}`);
next(err);
@@ -84,11 +107,50 @@ router
.delete(async (req, res, next) => {
try {
res.clearCookie("token", { path: "/api" });
res.status(200).send(true);
res.cookie("npmplus_oidc_no_redirect", "true", { secure: true, sameSite: "Strict" });
res.status(200).send({ expires: new Date(0).toISOString() });
} catch (err) {
debug(logger, `${req.method.toUpperCase()} ${req.originalUrl}: ${err}`);
next(err);
}
});
router
.route("/2fa")
.options((_, res) => {
res.sendStatus(204);
})
/**
* POST /tokens/2fa
*
* Verify 2FA code and get full token
*/
.post(async (req, res, next) => {
try {
if (process.env.OIDC_DISABLE_PASSWORD === "true") {
throw new errs.AuthError("Non OIDC login is disabled");
}
const { challenge_token, code } = await apiValidator(getValidationSchema("/tokens/2fa", "post"), req.body);
const result = await internalToken.verify2FA(challenge_token, code);
const { token, ...responseBody } = result;
if (result.token && result.expires) {
res.cookie("token", result.token, {
httpOnly: true,
secure: true,
sameSite: "lax",
path: "/api",
expires: new Date(result.expires),
});
}
res.status(200).send(responseBody);
} catch (err) {
debug(logger, `${req.method.toUpperCase()} ${req.path}: ${err}`);
next(err);
}
});
export default router;


@@ -1,4 +1,5 @@
import express from "express";
import internal2FA from "../internal/2fa.js";
import internalUser from "../internal/user.js";
import Access from "../lib/access.js";
import { isCI } from "../lib/config.js";
@@ -290,11 +291,146 @@ router
const result = await internalUser.loginAs(res.locals.access, {
id: Number.parseInt(req.params.user_id, 10),
});
res.status(200).send(result);
const { token, ...responseBody } = result;
if (result.token && result.expires) {
res.cookie("token", result.token, {
httpOnly: true,
secure: true,
sameSite: "Strict",
path: "/api",
expires: new Date(result.expires),
});
}
res.status(200).send(responseBody);
} catch (err) {
debug(logger, `${req.method.toUpperCase()} ${req.originalUrl}: ${err}`);
next(err);
}
});
/**
* User 2FA status
*
* /api/users/123/2fa
*/
router
.route("/:user_id/2fa")
.options((_, res) => {
res.sendStatus(204);
})
.all(jwtdecode())
.all(userIdFromMe)
/**
* POST /api/users/123/2fa
*
* Start 2FA setup, returns QR code URL
*/
.post(async (req, res, next) => {
try {
const result = await internal2FA.startSetup(res.locals.access, req.params.user_id);
res.status(200).send(result);
} catch (err) {
debug(logger, `${req.method.toUpperCase()} ${req.path}: ${err}`);
next(err);
}
})
/**
* GET /api/users/123/2fa
*
* Get 2FA status for a user
*/
.get(async (req, res, next) => {
try {
const status = await internal2FA.getStatus(res.locals.access, req.params.user_id);
res.status(200).send(status);
} catch (err) {
debug(logger, `${req.method.toUpperCase()} ${req.path}: ${err}`);
next(err);
}
})
/**
* DELETE /api/users/123/2fa?code=XXXXXX
*
* Disable 2FA for a user
*/
.delete(async (req, res, next) => {
try {
const code = typeof req.query.code === "string" ? req.query.code : null;
if (!code) {
throw new errs.ValidationError("Missing required parameter: code");
}
await internal2FA.disable(res.locals.access, req.params.user_id, code);
res.status(200).send(true);
} catch (err) {
debug(logger, `${req.method.toUpperCase()} ${req.path}: ${err}`);
next(err);
}
});
/**
* User 2FA enable
*
* /api/users/123/2fa/enable
*/
router
.route("/:user_id/2fa/enable")
.options((_, res) => {
res.sendStatus(204);
})
.all(jwtdecode())
.all(userIdFromMe)
/**
* POST /api/users/123/2fa/enable
*
* Verify code and enable 2FA
*/
.post(async (req, res, next) => {
try {
const { code } = await apiValidator(getValidationSchema("/users/{userID}/2fa/enable", "post"), req.body);
const result = await internal2FA.enable(res.locals.access, req.params.user_id, code);
res.status(200).send(result);
} catch (err) {
debug(logger, `${req.method.toUpperCase()} ${req.path}: ${err}`);
next(err);
}
});
/**
* User 2FA backup codes
*
* /api/users/123/2fa/backup-codes
*/
router
.route("/:user_id/2fa/backup-codes")
.options((_, res) => {
res.sendStatus(204);
})
.all(jwtdecode())
.all(userIdFromMe)
/**
* POST /api/users/123/2fa/backup-codes
*
* Regenerate backup codes
*/
.post(async (req, res, next) => {
try {
const { code } = await apiValidator(
getValidationSchema("/users/{userID}/2fa/backup-codes", "post"),
req.body,
);
const result = await internal2FA.regenerateBackupCodes(res.locals.access, req.params.user_id, code);
res.status(200).send(result);
} catch (err) {
debug(logger, `${req.method.toUpperCase()} ${req.path}: ${err}`);
next(err);
}
});
export default router;
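The 2FA endpoints above verify TOTP codes (the backend pulls in `otplib`, per the package.json diff). The underlying HOTP computation (RFC 4226) fits in a few lines of `node:crypto`; TOTP (RFC 6238) is HOTP with the counter set to `floor(unixTime / 30)`. This sketch is illustrative, not the repo's implementation:

```javascript
import { createHmac } from "node:crypto";

// RFC 4226 HOTP: HMAC-SHA1 over the big-endian counter, dynamic truncation.
function hotp(secret, counter, digits = 6) {
  const buf = Buffer.alloc(8);
  buf.writeBigUInt64BE(BigInt(counter));
  const digest = createHmac("sha1", secret).update(buf).digest();
  const offset = digest[digest.length - 1] & 0x0f;
  const code =
    ((digest[offset] & 0x7f) << 24) |
    (digest[offset + 1] << 16) |
    (digest[offset + 2] << 8) |
    digest[offset + 3];
  return String(code % 10 ** digits).padStart(digits, "0");
}

// RFC 6238 TOTP: HOTP over a 30-second time step.
function totp(secret, unixSeconds, step = 30) {
  return hotp(secret, Math.floor(unixSeconds / step));
}

// RFC 4226 test vector: ASCII secret "12345678901234567890", counter 0.
console.log(hotp(Buffer.from("12345678901234567890"), 0)); // 755224
```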


@@ -121,16 +121,56 @@
"type": "boolean",
"example": true
},
"npmplus_http3_support": {
"description": "HTTP3 Protocol Support",
"type": "boolean",
"example": true
},
"block_exploits": {
"description": "Should we block common exploits",
"description": "This is always disabled. Your value will be ignored",
"type": "boolean",
"example": false
},
"caching_enabled": {
"description": "Should we cache assets",
"description": "This is always disabled. Your value will be ignored",
"type": "boolean",
"example": false
},
"allow_websocket_upgrade": {
"description": "This is always enabled. Your value will be ignored",
"type": "boolean",
"example": true
},
"npmplus_noindex": {
"description": "Send noindex header and block some user agents",
"type": "boolean",
"example": false
},
"npmplus_crowdsec_appsec": {
"description": "Disable Crowdsec Appsec",
"type": "boolean",
"example": false
},
"npmplus_proxy_request_buffering": {
"description": "Disable Request Buffering",
"type": "boolean",
"example": false
},
"npmplus_proxy_response_buffering": {
"description": "Disable Response Buffering",
"type": "boolean",
"example": false
},
"npmplus_upstream_compression": {
"description": "Enable compression by upstream",
"type": "boolean",
"example": false
},
"npmplus_fancyindex": {
"description": "Enable fancyindex",
"type": "boolean",
"example": false
},
"email": {
"description": "Email address",
"type": "string",
@@ -220,10 +260,6 @@
"certificate_key": {
"type": "string",
"example": "-----BEGIN CERTIFICATE-----\nMIID...-----END CERTIFICATE-----"
},
"intermediate_certificate": {
"type": "string",
"example": "-----BEGIN CERTIFICATE-----\nMIID...-----END CERTIFICATE-----"
}
}
},


@@ -12,6 +12,7 @@
"hsts_enabled",
"hsts_subdomains",
"http2_support",
"npmplus_http3_support",
"advanced_config",
"enabled",
"meta"
@@ -48,6 +49,9 @@
"http2_support": {
"$ref": "../common.json#/properties/http2_support"
},
"npmplus_http3_support": {
"$ref": "../common.json#/properties/npmplus_http3_support"
},
"advanced_config": {
"type": "string",
"example": ""


@@ -18,11 +18,21 @@
"meta",
"allow_websocket_upgrade",
"http2_support",
"npmplus_http3_support",
"forward_scheme",
"enabled",
"locations",
"hsts_enabled",
"hsts_subdomains"
"hsts_subdomains",
"trust_forwarded_proto",
"npmplus_noindex",
"npmplus_crowdsec_appsec",
"npmplus_proxy_request_buffering",
"npmplus_proxy_response_buffering",
"npmplus_upstream_compression",
"npmplus_fancyindex",
"npmplus_x_frame_options",
"npmplus_auth_request"
],
"properties": {
"id": {
@@ -68,6 +78,49 @@
"block_exploits": {
"$ref": "../common.json#/properties/block_exploits"
},
"allow_websocket_upgrade": {
"$ref": "../common.json#/properties/allow_websocket_upgrade"
},
"npmplus_noindex": {
"$ref": "../common.json#/properties/npmplus_noindex"
},
"npmplus_crowdsec_appsec": {
"$ref": "../common.json#/properties/npmplus_crowdsec_appsec"
},
"npmplus_proxy_request_buffering": {
"$ref": "../common.json#/properties/npmplus_proxy_request_buffering"
},
"npmplus_proxy_response_buffering": {
"$ref": "../common.json#/properties/npmplus_proxy_response_buffering"
},
"npmplus_upstream_compression": {
"$ref": "../common.json#/properties/npmplus_upstream_compression"
},
"npmplus_fancyindex": {
"$ref": "../common.json#/properties/npmplus_fancyindex"
},
"npmplus_x_frame_options": {
"type": "string",
"enum": [
"DENY",
"SAMEORIGIN",
"upstream",
"none"
],
"example": "DENY"
},
"npmplus_auth_request": {
"type": "string",
"enum": [
"none",
"anubis",
"tinyauth",
"authelia",
"authentik",
"authentik-send-basic-auth"
],
"example": "none"
},
"advanced_config": {
"type": "string",
"example": ""
@@ -79,14 +132,12 @@
"nginx_err": null
}
},
"allow_websocket_upgrade": {
"description": "Allow Websocket Upgrade for all paths",
"type": "boolean",
"example": true
},
"http2_support": {
"$ref": "../common.json#/properties/http2_support"
},
"npmplus_http3_support": {
"$ref": "../common.json#/properties/npmplus_http3_support"
},
"forward_scheme": {
"type": "string",
"enum": [
@@ -121,6 +172,9 @@
"null"
]
},
"npmplus_enabled": {
"$ref": "../common.json#/properties/enabled"
},
"path": {
"type": "string",
"minLength": 1
@@ -154,6 +208,35 @@
"allow_websocket_upgrade": {
"$ref": "#/properties/allow_websocket_upgrade"
},
"npmplus_fancyindex_upstream_compression": {
"description": "This just exists so that old custom locations don't break",
"type": "boolean",
"example": false
},
"npmplus_noindex": {
"$ref": "#/properties/npmplus_noindex"
},
"npmplus_crowdsec_appsec": {
"$ref": "#/properties/npmplus_crowdsec_appsec"
},
"npmplus_proxy_request_buffering": {
"$ref": "#/properties/npmplus_proxy_request_buffering"
},
"npmplus_proxy_response_buffering": {
"$ref": "#/properties/npmplus_proxy_response_buffering"
},
"npmplus_upstream_compression": {
"$ref": "#/properties/npmplus_upstream_compression"
},
"npmplus_fancyindex": {
"$ref": "#/properties/npmplus_fancyindex"
},
"npmplus_x_frame_options": {
"$ref": "#/properties/npmplus_x_frame_options"
},
"npmplus_auth_request": {
"$ref": "#/properties/npmplus_auth_request"
},
"advanced_config": {
"type": "string"
}
@@ -174,6 +257,11 @@
"hsts_subdomains": {
"$ref": "../common.json#/properties/hsts_subdomains"
},
"trust_forwarded_proto": {
"type": "boolean",
"description": "Trust the forwarded headers",
"example": false
},
"certificate": {
"oneOf": [
{

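The proxy-host schema above constrains `npmplus_x_frame_options` and `npmplus_auth_request` to fixed enums. A minimal client-side pre-check can reject bad values before the API round-trip; this sketch hardcodes the enum sets shown in the schema (the function name and error format are illustrative, not part of the API):

```python
# Enum sets copied from the proxy-host schema above.
X_FRAME_OPTIONS = {"DENY", "SAMEORIGIN", "upstream", "none"}
AUTH_REQUEST = {
    "none", "anubis", "tinyauth", "authelia",
    "authentik", "authentik-send-basic-auth",
}

def validate_proxy_host(payload: dict) -> list:
    """Return a list of human-readable errors; an empty list means the
    enum-constrained fields are valid. (Hypothetical helper, not NPMplus API.)"""
    errors = []
    if payload.get("npmplus_x_frame_options") not in X_FRAME_OPTIONS:
        errors.append("npmplus_x_frame_options must be one of: "
                      + ", ".join(sorted(X_FRAME_OPTIONS)))
    if payload.get("npmplus_auth_request") not in AUTH_REQUEST:
        errors.append("npmplus_auth_request must be one of: "
                      + ", ".join(sorted(AUTH_REQUEST)))
    return errors
```

Note that `upstream` and `none` are lowercase while the header values `DENY`/`SAMEORIGIN` are uppercase, so a case-insensitive check would accept values the schema rejects.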

@@ -16,6 +16,7 @@
"hsts_enabled",
"hsts_subdomains",
"http2_support",
"npmplus_http3_support",
"block_exploits",
"advanced_config",
"enabled",
@@ -81,6 +82,9 @@
"http2_support": {
"$ref": "../common.json#/properties/http2_support"
},
"npmplus_http3_support": {
"$ref": "../common.json#/properties/npmplus_http3_support"
},
"block_exploits": {
"$ref": "../common.json#/properties/block_exploits"
},


@@ -1,8 +1,7 @@
{
"bearerAuth": {
"type": "http",
"scheme": "bearer",
"bearerFormat": "JWT",
"description": "JWT Bearer Token authentication"
"cookieAuth": {
"type": "apiKey",
"in": "cookie",
"name": "token"
}
}

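With the security scheme switched from an HTTP bearer token to an `apiKey` carried in a cookie named `token`, clients no longer send an `Authorization` header. A minimal sketch of building such a request with the standard library (the URL is a placeholder assumption):

```python
import urllib.request

def authed_request(url: str, token: str) -> urllib.request.Request:
    """Build a request that carries the session token in a 'token' cookie,
    matching the cookieAuth apiKey scheme (in: cookie, name: token) above."""
    req = urllib.request.Request(url)
    # cookieAuth replaces the former "Authorization: Bearer <jwt>" header.
    req.add_header("Cookie", f"token={token}")
    return req
```

In a browser the cookie would normally be set by the server (typically `HttpOnly`) and attached automatically; the manual header is only needed for non-browser clients.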

@@ -11,7 +11,8 @@
"forwarding_port",
"tcp_forwarding",
"udp_forwarding",
"proxy_protocol_forwarding",
"npmplus_proxy_protocol_forwarding",
"npmplus_proxy_tls",
"enabled",
"meta"
],
@@ -57,7 +58,11 @@
"type": "boolean",
"example": false
},
"proxy_protocol_forwarding": {
"npmplus_proxy_protocol_forwarding": {
"type": "boolean",
"example": false
},
"npmplus_proxy_tls": {
"type": "boolean",
"example": false
},

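The stream schema renames `proxy_protocol_forwarding` to `npmplus_proxy_protocol_forwarding` and adds a required `npmplus_proxy_tls` boolean. A hedged sketch of migrating an old stream payload to the new field names (the helper is hypothetical; defaults follow the `"example": false` values in the schema):

```python
def migrate_stream_payload(old: dict) -> dict:
    """Rename the old proxy_protocol_forwarding key to its npmplus_ form
    and default the new npmplus_proxy_tls flag to False when absent."""
    new = dict(old)  # shallow copy; leave the caller's dict untouched
    if "proxy_protocol_forwarding" in new:
        new["npmplus_proxy_protocol_forwarding"] = new.pop("proxy_protocol_forwarding")
    new.setdefault("npmplus_proxy_tls", False)
    return new
```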

@@ -0,0 +1,18 @@
{
"type": "object",
"description": "Token object",
"required": ["requires_2fa", "challenge_token"],
"additionalProperties": false,
"properties": {
"requires_2fa": {
"description": "Whether this token request requires two-factor authentication",
"example": true,
"type": "boolean"
},
"challenge_token": {
"description": "Challenge Token used in subsequent 2FA verification",
"example": "eyJhbGciOiJSUzUxMiIsInR5cCI6IkpXVCJ9.ey...xaHKYr3Kk6MvkUjcC4",
"type": "string"
}
}
}

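The new token object implies a two-step login: when `requires_2fa` is true, the client must replay `challenge_token` in a follow-up verification call. The schema does not name that endpoint, so this sketch takes the second step as a caller-supplied callable:

```python
def handle_token_response(resp: dict, verify_2fa) -> None:
    """Sketch of the two-step flow implied by the schema above.
    resp       -- dict with 'requires_2fa' and 'challenge_token' keys
    verify_2fa -- callable invoked with the challenge token; the actual
                  verification endpoint is an assumption, not in the schema."""
    if resp["requires_2fa"]:
        # The challenge token is only valid for the subsequent 2FA check.
        verify_2fa(resp["challenge_token"])
```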

@@ -2,8 +2,7 @@
"type": "object",
"description": "Token object",
"required": [
"expires",
"token"
"expires"
],
"additionalProperties": false,
"properties": {
@@ -11,11 +10,6 @@
"description": "Token Expiry ISO Time String",
"example": "2025-02-04T20:40:46.340Z",
"type": "string"
},
"token": {
"description": "JWT Token",
"example": "eyJhbGciOiJSUzUxMiIsInR5cCI6IkpXVCJ9.ey...xaHKYr3Kk6MvkUjcC4",
"type": "string"
}
}
}

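Since the `token` string was dropped from the response (the JWT now travels in the cookie), the remaining `expires` ISO time string is what a client checks to decide when to re-authenticate. A small sketch, noting that the trailing `Z` must be normalized for `datetime.fromisoformat` on Python versions before 3.11:

```python
from datetime import datetime, timezone

def token_expired(token_obj: dict, now=None) -> bool:
    """True if the token's 'expires' ISO time string is in the past.
    'Z' is rewritten as '+00:00' for pre-3.11 fromisoformat compatibility."""
    expires = datetime.fromisoformat(token_obj["expires"].replace("Z", "+00:00"))
    now = now or datetime.now(timezone.utc)
    return now >= expires
```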

@@ -6,7 +6,7 @@
],
"security": [
{
"bearerAuth": [
"cookieAuth": [
"admin"
]
}


@@ -6,7 +6,7 @@
],
"security": [
{
"bearerAuth": [
"cookieAuth": [
"admin"
]
}


@@ -6,7 +6,7 @@
],
"security": [
{
"bearerAuth": [
"cookieAuth": [
"access_lists.view"
]
}


@@ -6,7 +6,7 @@
],
"security": [
{
"bearerAuth": [
"cookieAuth": [
"access_lists.manage"
]
}


@@ -6,7 +6,7 @@
],
"security": [
{
"bearerAuth": [
"cookieAuth": [
"access_lists.view"
]
}


@@ -6,7 +6,7 @@
],
"security": [
{
"bearerAuth": [
"cookieAuth": [
"access_lists.manage"
]
}


@@ -6,7 +6,7 @@
],
"security": [
{
"bearerAuth": [
"cookieAuth": [
"access_lists.manage"
]
}


@@ -6,7 +6,7 @@
],
"security": [
{
"bearerAuth": [
"cookieAuth": [
"certificates.manage"
]
}


@@ -6,7 +6,7 @@
],
"security": [
{
"bearerAuth": [
"cookieAuth": [
"certificates.manage"
]
}


@@ -6,7 +6,7 @@
],
"security": [
{
"bearerAuth": [
"cookieAuth": [
"certificates.view"
]
}
