Merge tag '2.11.0-RC3'
Some checks failed
Commit Message Check / Check Commit Message (push) Has been cancelled

Andrey Antukh
2025-11-04 16:43:32 +01:00
745 changed files with 78479 additions and 39907 deletions

View File

@@ -326,24 +326,16 @@ jobs:
workflows:
penpot:
jobs:
- lint
- test-frontend:
requires:
- lint: success
- test-library:
requires:
- test-frontend: success
- lint: success
- test-components:
requires:
- test-frontend: success
- lint: success
- test-integration:
requires:
- test-frontend: success
- lint: success
- test-backend:
@@ -354,4 +346,6 @@ workflows:
requires:
- lint: success
- lint
- test-integration
- test-render-wasm

View File

@@ -46,6 +46,7 @@ jobs:
mv penpot/backend bundle-backend
mv penpot/frontend bundle-frontend
mv penpot/exporter bundle-exporter
mv penpot/storybook bundle-storybook
popd
- name: Set up Docker Buildx
@@ -99,3 +100,29 @@ jobs:
tags: ${{ secrets.DOCKER_REGISTRY }}/${{ env.DOCKER_IMAGE }}:${{ steps.vars.outputs.gh_ref }}
cache-from: type=registry,ref=${{ secrets.DOCKER_REGISTRY }}/${{ env.DOCKER_IMAGE }}:buildcache
cache-to: type=registry,ref=${{ secrets.DOCKER_REGISTRY }}/${{ env.DOCKER_IMAGE }}:buildcache,mode=max
- name: Build and push Storybook Docker image
uses: docker/build-push-action@v6
env:
DOCKER_IMAGE: 'storybook'
BUNDLE_PATH: './bundle-storybook'
with:
context: ./docker/images/
file: ./docker/images/Dockerfile.storybook
platforms: linux/amd64,linux/arm64
push: true
tags: ${{ secrets.DOCKER_REGISTRY }}/${{ env.DOCKER_IMAGE }}:${{ steps.vars.outputs.gh_ref }}
cache-from: type=registry,ref=${{ secrets.DOCKER_REGISTRY }}/${{ env.DOCKER_IMAGE }}:buildcache
cache-to: type=registry,ref=${{ secrets.DOCKER_REGISTRY }}/${{ env.DOCKER_IMAGE }}:buildcache,mode=max
- name: Notify Mattermost
if: failure()
uses: mattermost/action-mattermost-notify@master
with:
MATTERMOST_WEBHOOK_URL: ${{ secrets.MATTERMOST_WEBHOOK }}
MATTERMOST_CHANNEL: bot-alerts-cicd
TEXT: |
❌ 🐳 *[PENPOT] Error building penpot docker images.*
📄 Triggered from ref: `${{ steps.vars.outputs.gh_ref }}`
🔗 Run: https://github.com/${{ github.repository }}/actions/runs/${{ github.run_id }}
@infra

View File

@@ -26,7 +26,7 @@ jobs:
- name: Check Commit Type
uses: gsactions/commit-message-checker@v2
with:
pattern: '^(Merge|Revert|:(lipstick|globe_with_meridians|wrench|books|arrow_up|arrow_down|zap|ambulance|construction|boom|fire|whale|bug|sparkles|paperclip|tada|recycle|rewind):)\s["A-Z].*[^.]$'
pattern: '^(Merge|Revert|:(lipstick|globe_with_meridians|wrench|books|arrow_up|arrow_down|zap|ambulance|construction|boom|fire|whale|bug|sparkles|paperclip|tada|recycle|rewind|construction_worker):)\s["A-Z].*[^.]$'
flags: 'gm'
error: 'Commit should match CONTRIBUTING.md guideline'
checkAllCommitMessages: 'true' # optional: this checks all commits associated with a pull request
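As a quick local check (a sketch, not part of the workflow), the updated pattern can be exercised with bash's built-in regex matcher; `\s` is adapted to `[[:space:]]` because `[[ =~ ]]` uses POSIX ERE rather than PCRE:

```shell
#!/usr/bin/env bash
# Sketch: validate a commit subject against the workflow's pattern,
# adapted to POSIX ERE (\s -> [[:space:]]).
pattern='^(Merge|Revert|:(lipstick|globe_with_meridians|wrench|books|arrow_up|arrow_down|zap|ambulance|construction|boom|fire|whale|bug|sparkles|paperclip|tada|recycle|rewind|construction_worker):)[[:space:]]["A-Z].*[^.]$'

check() {
  # Echo whether a single subject line passes the guideline pattern.
  if [[ "$1" =~ $pattern ]]; then
    echo "ok: $1"
  else
    echo "rejected: $1"
  fi
}

check ':construction_worker: Update CI pipeline'   # accepted with the new emoji
check ':sparkles: add feature.'                    # rejected: lowercase start, trailing period
```

The diff's only change is adding `construction_worker` to the emoji alternation, so subjects like the first example start passing the check.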

View File

@@ -1,17 +1,91 @@
# CHANGELOG
## 2.11.0 (Unreleased)
### :boom: Breaking changes & Deprecations
- Deprecate configuration variables with the prefix `PENPOT_ASSETS_*`; they will be
removed in future versions:
- The `PENPOT_ASSETS_STORAGE_BACKEND` becomes `PENPOT_OBJECTS_STORAGE_BACKEND` and its
values change from (`assets-fs` or `assets-s3`) to (`fs` or `s3`)
- The `PENPOT_STORAGE_ASSETS_FS_DIRECTORY` becomes `PENPOT_OBJECTS_STORAGE_FS_DIRECTORY`
- The `PENPOT_STORAGE_ASSETS_S3_BUCKET` becomes `PENPOT_OBJECTS_STORAGE_S3_BUCKET`
- The `PENPOT_STORAGE_ASSETS_S3_REGION` becomes `PENPOT_OBJECTS_STORAGE_S3_REGION`
- The `PENPOT_STORAGE_ASSETS_S3_ENDPOINT` becomes `PENPOT_OBJECTS_STORAGE_S3_ENDPOINT`
- The `PENPOT_STORAGE_ASSETS_S3_IO_THREADS` is replaced (see below)
- Add `PENPOT_NETTY_IO_THREADS` and `PENPOT_EXECUTOR_THREADS` variables to provide
control over the concurrency of the shared resources used by Netty. Penpot uses the
Netty IO threads for AWS S3 SDK and Redis/Valkey communication, and the executor
threads to perform tasks outside the HTTP serving threads, such as cache invalidation,
S3 response completion, configuration reloading, and many other auxiliary tasks. By
default both executors use half the number of available CPUs, with a minimum of 2. You
should not touch these variables unless you know what you are doing.
- Replace `PENPOT_STORAGE_ASSETS_S3_IO_THREADS` with a more general setting,
`PENPOT_NETTY_IO_THREADS`, used to configure the shared Netty resources across the
different services that use Netty internally (the Redis connection and the S3 SDK
client). This configuration is not commonly used, so no real impact is expected for
most users.
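For self-hosted setups, the rename can be sketched as a small shell snippet (variable names are taken from the list above; the values and the snippet itself are illustrative, not an official migration tool):

```shell
#!/usr/bin/env bash
# Illustrative old-style configuration (values are made up for the example):
PENPOT_ASSETS_STORAGE_BACKEND=assets-s3
PENPOT_STORAGE_ASSETS_S3_BUCKET=penpot
PENPOT_STORAGE_ASSETS_S3_REGION=eu-west-1

# New-style equivalents: the backend value drops its "assets-" prefix
# (assets-fs -> fs, assets-s3 -> s3); the rest are plain renames.
export PENPOT_OBJECTS_STORAGE_BACKEND="${PENPOT_ASSETS_STORAGE_BACKEND#assets-}"
export PENPOT_OBJECTS_STORAGE_S3_BUCKET="$PENPOT_STORAGE_ASSETS_S3_BUCKET"
export PENPOT_OBJECTS_STORAGE_S3_REGION="$PENPOT_STORAGE_ASSETS_S3_REGION"

echo "$PENPOT_OBJECTS_STORAGE_BACKEND"   # prints: s3
```

During the deprecation window both spellings are presumably read, but moving to the `PENPOT_OBJECTS_STORAGE_*` names avoids surprises when the old ones are removed.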
### :rocket: Epics and highlights
### :heart: Community contributions (Thank you!)
### :sparkles: New features & Enhancements
- New composite token: Typography [Taiga #10200](https://tree.taiga.io/project/penpot/us/10200)
- Show current Penpot version [Taiga #11603](https://tree.taiga.io/project/penpot/us/11603)
- Switch several variant copies at the same time [Taiga #11411](https://tree.taiga.io/project/penpot/us/11411)
- Invitations management improvements [Taiga #3479](https://tree.taiga.io/project/penpot/us/3479)
- Alternative ways of creating variants - Button Viewport [Taiga #11931](https://tree.taiga.io/project/penpot/us/11931)
- Reorder properties for a component [Taiga #10225](https://tree.taiga.io/project/penpot/us/10225)
- File Data storage layout refactor [Github #7345](https://github.com/penpot/penpot/pull/7345)
- Make several queries optimization on comment threads [Github #7506](https://github.com/penpot/penpot/pull/7506)
### :bug: Bugs fixed
- Fix selection problems when devtools open [Taiga #11950](https://tree.taiga.io/project/penpot/issue/11950)
- Fix long font names overlap [Taiga #11844](https://tree.taiga.io/project/penpot/issue/11844)
- Fix paste behavior according to the selected element [Taiga #11979](https://tree.taiga.io/project/penpot/issue/11979)
- Fix problem with export size [Github #7160](https://github.com/penpot/penpot/issues/7160)
- Fix multi level library dependencies [Taiga #12155](https://tree.taiga.io/project/penpot/issue/12155)
- Fix component context menu options order in assets tab [Taiga #11941](https://tree.taiga.io/project/penpot/issue/11941)
- Fix error updating library [Taiga #12218](https://tree.taiga.io/project/penpot/issue/12218)
- Fix restoring a variant in another file makes it overlap the existing variant [Taiga #12049](https://tree.taiga.io/project/penpot/issue/12049)
- Fix auto-width changes to fixed when switching variants [Taiga #12172](https://tree.taiga.io/project/penpot/issue/12172)
- Fix component number has no singular translation string [Taiga #12106](https://tree.taiga.io/project/penpot/issue/12106)
- Fix adding/removing identical text fills [Taiga #12287](https://tree.taiga.io/project/penpot/issue/12287)
- Fix scroll on the inspect tab [Taiga #12293](https://tree.taiga.io/project/penpot/issue/12293)
- Fix lock proportion tooltip [Taiga #12326](https://tree.taiga.io/project/penpot/issue/12326)
- Fix internal Error when selecting a set by name in the token theme editor [Taiga #12310](https://tree.taiga.io/project/penpot/issue/12310)
- Fix drag & drop functionality swapping instead of reordering [Taiga #12254](https://tree.taiga.io/project/penpot/issue/12254)
- Fix variants not synchronizing tokens on switch [Taiga #12290](https://tree.taiga.io/project/penpot/issue/12290)
- Fix incorrect behavior of Alt + Drag for variants [Taiga #12309](https://tree.taiga.io/project/penpot/issue/12309)
- Fix text override is lost after switch [Taiga #12269](https://tree.taiga.io/project/penpot/issue/12269)
- Fix exporting a board crashing the app [Taiga #12384](https://tree.taiga.io/project/penpot/issue/12384)
- Fix nested variant in a component doesn't keep inherited overrides [Taiga #12299](https://tree.taiga.io/project/penpot/issue/12299)
- Fix selected colors not showing colors from children shapes in multiple selection [Taiga #12385](https://tree.taiga.io/project/penpot/issue/12385)
- Fix scrollbar issue in design tab [Taiga #12367](https://tree.taiga.io/project/penpot/issue/12367)
- Fix library update notifications showing when they should not [Taiga #12397](https://tree.taiga.io/project/penpot/issue/12397)
- Fix remove flex button doesn't work within variant [Taiga #12314](https://tree.taiga.io/project/penpot/issue/12314)
- Fix a translation error [Taiga #12402](https://tree.taiga.io/project/penpot/issue/12402)
- Fix problem with certain text input in some editable labels (pages, components, tokens...) being in conflict with the drag/drop functionality [Taiga #12316](https://tree.taiga.io/project/penpot/issue/12316)
- Fix not controlled theme renaming [Taiga #12411](https://tree.taiga.io/project/penpot/issue/12411)
- Fix paste without selection sends the new element in the back [Taiga #12382](https://tree.taiga.io/project/penpot/issue/12382)
- Fix options button does not work for comments created in the lower part of the screen [Taiga #12422](https://tree.taiga.io/project/penpot/issue/12422)
- Fix problem when checking usage with removed teams [Taiga #12442](https://tree.taiga.io/project/penpot/issue/12442)
## 2.10.1
### :sparkles: New features & Enhancements
- Improve workspace file loading [Github 7366](https://github.com/penpot/penpot/pull/7366)
### :bug: Bugs fixed
- Fix regression with text shapes creation with Plugins API [Taiga #12244](https://tree.taiga.io/project/penpot/issue/12244)
## 2.10.0
### :rocket: Epics and highlights
@@ -47,6 +121,7 @@
- Retrieve variants with nested components [Taiga #10277](https://tree.taiga.io/project/penpot/us/10277)
- Create variants in bulk from existing components [Taiga #7926](https://tree.taiga.io/project/penpot/us/7926)
- Alternative ways of creating variants - Button Design Tab [Taiga #10316](https://tree.taiga.io/project/penpot/us/10316)
- Fix problem with component swapping panel [Taiga #12175](https://tree.taiga.io/project/penpot/issue/12175)
### :bug: Bugs fixed
@@ -60,7 +135,7 @@
- Fix issue where Alt + arrow keys shortcut interferes with letter-spacing when moving text layers [Taiga #11552](https://tree.taiga.io/project/penpot/issue/11771)
- Fix consistency issues on how font variants are visualized [Taiga #11499](https://tree.taiga.io/project/penpot/us/11499)
- Fix parsing rx and ry SVG values for rect radius [Taiga #11861](https://tree.taiga.io/project/penpot/issue/11861)
- Misleading affordance in saved versions [Taiga #11887](https://tree.taiga.io/project/penpot/issue/11887)
- Fix misleading affordance in saved versions [Taiga #11887](https://tree.taiga.io/project/penpot/issue/11887)
- Fix pasting RTF text crashes penpot [Taiga #11717](https://tree.taiga.io/project/penpot/issue/11717)
- Fix navigation arrows in Libraries & Templates carousel [Taiga #10609](https://tree.taiga.io/project/penpot/issue/10609)
- Fix applying tokens with zero value to size [Taiga #11618](https://tree.taiga.io/project/penpot/issue/11618)
@@ -107,7 +182,6 @@
- Add info to apply-token event [Taiga #11710](https://tree.taiga.io/project/penpot/task/11710)
- Fix double click on set name input [Taiga #11747](https://tree.taiga.io/project/penpot/issue/11747)
### :bug: Bugs fixed
- Copying font size does not copy the unit [Taiga #11143](https://tree.taiga.io/project/penpot/issue/11143)
@@ -155,7 +229,7 @@
**Penpot Library**
The initial prototype is completely reworked for provide a more consistent API
The initial prototype is completely reworked to provide a more consistent API
and to have proper validation and params decoding. All the details can be found
on [its own changelog](library/CHANGES.md)

View File

@@ -77,17 +77,14 @@ Provide your team or organization with a completely owned collaborative design t
### Integrations ###
Penpot offers integration into the development toolchain, thanks to its support for webhooks and an API accessible through access tokens.
### Whats great for design ###
### Building Design Systems: design tokens, components and variants ###
With Penpot you can design libraries to share and reuse; turn design elements into components and tokens to allow reusability and scalability; and build realistic user flows and interactions.
Penpot brings design systems to code-minded teams: a single source of truth with native Design Tokens, Components, and Variants for scalable, reusable, and consistent UI across projects and platforms.
### Design Tokens ###
With Penpots standardized [design tokens](https://penpot.dev/collaboration/design-tokens) format, you can easily reuse and sync tokens across different platforms, workflows, and disciplines.
<br />
<p align="center">
<img src="https://img.plasmic.app/img-optimizer/v1/img?src=https%3A%2F%2Fimg.plasmic.app%2Fimg-optimizer%2Fv1%2Fimg%2F9dd677c36afb477e9666ccd1d3f009ad.png" alt="Open Source" style="width: 65%;">
<img src="https://github.com/user-attachments/assets/cce75ad6-f783-473f-8803-da9eb8255fef">
</p>
<br />

View File

@@ -6,7 +6,7 @@
org.clojure/clojure {:mvn/version "1.12.2"}
org.clojure/tools.namespace {:mvn/version "1.5.0"}
com.github.luben/zstd-jni {:mvn/version "1.5.7-3"}
com.github.luben/zstd-jni {:mvn/version "1.5.7-4"}
io.prometheus/simpleclient {:mvn/version "0.16.0"}
io.prometheus/simpleclient_hotspot {:mvn/version "0.16.0"}
@@ -17,7 +17,7 @@
io.prometheus/simpleclient_httpserver {:mvn/version "0.16.0"}
io.lettuce/lettuce-core {:mvn/version "6.7.0.RELEASE"}
io.lettuce/lettuce-core {:mvn/version "6.8.1.RELEASE"}
;; Minimal dependencies required by lettuce, we need to include them
;; explicitly because clojure dependency management does not support
;; yet the BOM format.
@@ -28,29 +28,30 @@
com.google.guava/guava {:mvn/version "33.4.8-jre"}
funcool/yetti
{:git/tag "v11.4"
{:git/tag "v11.6"
:git/sha "ce50d42"
:git/sha "94dc017"
:git/url "https://github.com/funcool/yetti.git"
:exclusions [org.slf4j/slf4j-api]}
com.github.seancorfield/next.jdbc
{:mvn/version "1.3.1002"}
{:mvn/version "1.3.1070"}
metosin/reitit-core {:mvn/version "0.9.1"}
nrepl/nrepl {:mvn/version "1.3.1"}
nrepl/nrepl {:mvn/version "1.4.0"}
org.postgresql/postgresql {:mvn/version "42.7.7"}
org.xerial/sqlite-jdbc {:mvn/version "3.49.1.0"}
org.xerial/sqlite-jdbc {:mvn/version "3.50.3.0"}
com.zaxxer/HikariCP {:mvn/version "6.3.0"}
com.zaxxer/HikariCP {:mvn/version "7.0.2"}
io.whitfin/siphash {:mvn/version "2.0.0"}
buddy/buddy-hashers {:mvn/version "2.0.167"}
buddy/buddy-sign {:mvn/version "3.6.1-359"}
com.github.ben-manes.caffeine/caffeine {:mvn/version "3.2.0"}
com.github.ben-manes.caffeine/caffeine {:mvn/version "3.2.2"}
org.jsoup/jsoup {:mvn/version "1.20.1"}
org.jsoup/jsoup {:mvn/version "1.21.2"}
org.im4java/im4java
{:git/tag "1.4.0-penpot-2"
:git/sha "e2b3e16"
@@ -60,12 +61,12 @@
org.clojars.pntblnk/clj-ldap {:mvn/version "0.0.17"}
dawran6/emoji {:mvn/version "0.1.5"}
dawran6/emoji {:mvn/version "0.2.0"}
markdown-clj/markdown-clj {:mvn/version "1.12.3"}
markdown-clj/markdown-clj {:mvn/version "1.12.4"}
;; Pretty Print specs
pretty-spec/pretty-spec {:mvn/version "0.1.4"}
software.amazon.awssdk/s3 {:mvn/version "2.33.8"}}
software.amazon.awssdk/s3 {:mvn/version "2.33.10"}}
:paths ["src" "resources" "target/classes"]
:aliases
@@ -80,12 +81,14 @@
:build
{:extra-deps
{io.github.clojure/tools.build {:git/tag "v0.10.9" :git/sha "e405aac"}}
{io.github.clojure/tools.build {:mvn/version "0.10.10"}}
:ns-default build}
:test
{:main-opts ["-m" "kaocha.runner"]
:jvm-opts ["-Dlog4j2.configurationFile=log4j2-devenv-repl.xml"]
:jvm-opts ["-Dlog4j2.configurationFile=log4j2-devenv-repl.xml"
"--sun-misc-unsafe-memory-access=allow"
"--enable-native-access=ALL-UNNAMED"]
:extra-deps {lambdaisland/kaocha {:mvn/version "1.91.1392"}}}
:outdated

View File

@@ -30,8 +30,8 @@
[app.config :as cf]
[app.db :as db]
[app.main :as main]
[app.srepl.helpers :as srepl.helpers]
[app.srepl.helpers :as h]
[app.srepl.main :as srepl]
[app.srepl.main :refer :all]
[app.util.blob :as blob]
[clj-async-profiler.core :as prof]
[clojure.contrib.humanize :as hum]

View File

@@ -1 +1 @@
Invitation to join {{team}}
{{invited-by|abbreviate:25}} has invited you to join the team “{{ team|abbreviate:25 }}

View File

@@ -1,6 +1,9 @@
[{:id "tokens-starter-kit"
:name "Design tokens starter kit"
:file-uri "https://github.com/penpot/penpot-files/raw/refs/heads/main/Tokens%20starter%20kit.penpot"},
:file-uri "https://github.com/penpot/penpot-files/raw/refs/heads/main/Tokens%20starter%20kit.penpot"}
{:id "penpot-design-system"
:name "Penpot Design System | Pencil"
:file-uri "https://github.com/penpot/penpot-files/raw/refs/heads/main/penpot-app.penpot"}
{:id "wireframing-kit"
:name "Wireframe library"
:file-uri "https://github.com/penpot/penpot-files/raw/refs/heads/main/Wireframing%20kit%20v1.1.penpot"}
@@ -10,9 +13,6 @@
{:id "plants-app"
:name "UI mockup example"
:file-uri "https://github.com/penpot/penpot-files/raw/main/Plants-app.penpot"}
{:id "penpot-design-system"
:name "Design system example"
:file-uri "https://github.com/penpot/penpot-files/raw/refs/heads/main/Penpot%20-%20Design%20System%20v2.1.penpot"}
{:id "tutorial-for-beginners"
:name "Tutorial for beginners"
:file-uri "https://github.com/penpot/penpot-files/raw/main/tutorial-for-beginners.penpot"}

View File

@@ -45,7 +45,41 @@ Debug Main Page
</form>
</fieldset>
<fieldset>
<legend>VIRTUAL CLOCK</legend>
<desc>
<p>
CURRENT CLOCK: <b>{{current-clock}}</b>
<br />
CURRENT OFFSET: <b>{{current-offset}}</b>
<br />
CURRENT TIME: <b>{{current-time}}</b>
</p>
<p>Examples: 3h, -7h, 24h (allowed suffixes: h, s)</p>
</desc>
<form method="post" action="/dbg/actions/set-virtual-clock">
<div class="row">
<input type="text" name="offset" placeholder="3h" value="" />
</div>
<div class="row">
<label for="force-verify">Are you sure?</label>
<input id="force-verify" type="checkbox" name="force" />
<br />
<small>
This is just a security double-check to prevent unintentional submits.
</small>
</div>
<div class="row">
<input type="submit" name="submit" value="Submit" />
<input type="submit" name="reset" value="Reset" />
</div>
</form>
</fieldset>
</section>

backend/scripts/_env Normal file
View File

@@ -0,0 +1,81 @@
#!/usr/bin/env bash
export PENPOT_MANAGEMENT_API_SHARED_KEY=super-secret-management-api-key
export PENPOT_SECRET_KEY=super-secret-devenv-key
export PENPOT_HOST=devenv
export PENPOT_FLAGS="\
$PENPOT_FLAGS \
enable-login-with-ldap \
enable-login-with-password
enable-login-with-oidc \
enable-login-with-google \
enable-login-with-github \
enable-login-with-gitlab \
enable-backend-worker \
enable-backend-asserts \
disable-feature-fdata-pointer-map \
enable-feature-fdata-objects-map \
enable-audit-log \
enable-transit-readable-response \
enable-demo-users \
disable-secure-session-cookies \
enable-smtp \
enable-prepl-server \
enable-urepl-server \
enable-rpc-climit \
enable-rpc-rlimit \
enable-quotes \
enable-soft-rpc-rlimit \
enable-auto-file-snapshot \
enable-webhooks \
enable-access-tokens \
disable-tiered-file-data-storage \
enable-file-validation \
enable-file-schema-validation \
enable-redis-cache \
enable-subscriptions";
# Default deletion delay for devenv
export PENPOT_DELETION_DELAY="24h"
# Setup default upload media file size to 100MiB
export PENPOT_MEDIA_MAX_FILE_SIZE=104857600
# Setup default multipart upload size to 300MiB
export PENPOT_HTTP_SERVER_MAX_MULTIPART_BODY_SIZE=314572800
export AWS_ACCESS_KEY_ID=penpot-devenv
export AWS_SECRET_ACCESS_KEY=penpot-devenv
export PENPOT_OBJECTS_STORAGE_BACKEND=s3
export PENPOT_OBJECTS_STORAGE_S3_ENDPOINT=http://minio:9000
export PENPOT_OBJECTS_STORAGE_S3_BUCKET=penpot
export JAVA_OPTS="\
-Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager \
-Djdk.attach.allowAttachSelf \
-Dlog4j2.configurationFile=log4j2-devenv.xml \
-Djdk.tracePinnedThreads=full \
-Dim4java.useV7=true \
-XX:+UnlockExperimentalVMOptions \
-XX:+UseShenandoahGC \
-XX:+UseCompactObjectHeaders \
-XX:ShenandoahGCMode=generational \
-XX:-OmitStackTraceInFastThrow \
--sun-misc-unsafe-memory-access=allow \
--enable-preview \
--enable-native-access=ALL-UNNAMED";
function setup_minio() {
# Initialize MINIO config
mc alias set penpot-s3/ http://minio:9000 minioadmin minioadmin -q
mc admin user add penpot-s3 penpot-devenv penpot-devenv -q
mc admin user info penpot-s3 penpot-devenv |grep -F -q "readwrite"
if [ "$?" = "1" ]; then
mc admin policy attach penpot-s3 readwrite --user=penpot-devenv -q
fi
mc mb penpot-s3/penpot -p -q
}

View File

@@ -1,112 +1,13 @@
#!/usr/bin/env bash
export PENPOT_SECRET_KEY=super-secret-devenv-key
SCRIPT_DIR=$(dirname $0);
export PENPOT_HOST=devenv
source $SCRIPT_DIR/_env;
export PENPOT_FLAGS="\
$PENPOT_FLAGS \
enable-login-with-ldap \
enable-login-with-password
enable-login-with-oidc \
enable-login-with-google \
enable-login-with-github \
enable-login-with-gitlab \
enable-backend-worker \
enable-backend-asserts \
disable-feature-fdata-pointer-map \
enable-feature-fdata-objects-map \
enable-audit-log \
enable-transit-readable-response \
enable-demo-users \
disable-secure-session-cookies \
enable-smtp \
enable-prepl-server \
enable-urepl-server \
enable-rpc-climit \
enable-rpc-rlimit \
enable-quotes \
enable-soft-rpc-rlimit \
enable-auto-file-snapshot \
enable-webhooks \
enable-access-tokens \
disable-tiered-file-data-storage \
enable-file-validation \
enable-file-schema-validation \
enable-subscriptions \
disable-subscriptions-old";
# Default deletion delay for devenv
export PENPOT_DELETION_DELAY="24h"
# Setup default upload media file size to 100MiB
export PENPOT_MEDIA_MAX_FILE_SIZE=104857600
# Setup default multipart upload size to 300MiB
export PENPOT_HTTP_SERVER_MAX_MULTIPART_BODY_SIZE=314572800
# export PENPOT_DATABASE_URI="postgresql://172.17.0.1:5432/penpot"
# export PENPOT_DATABASE_USERNAME="penpot"
# export PENPOT_DATABASE_PASSWORD="penpot"
# export PENPOT_DATABASE_READONLY=true
# export PENPOT_DATABASE_URI="postgresql://172.17.0.1:5432/penpot_pre"
# export PENPOT_DATABASE_USERNAME="penpot_pre"
# export PENPOT_DATABASE_PASSWORD="penpot_pre"
# export PENPOT_LOGGERS_LOKI_URI="http://172.17.0.1:3100/loki/api/v1/push"
# export PENPOT_AUDIT_LOG_ARCHIVE_URI="http://localhost:6070/api/audit"
# Initialize MINIO config
mc alias set penpot-s3/ http://minio:9000 minioadmin minioadmin -q
setup_minio;
mc admin user add penpot-s3 penpot-devenv penpot-devenv -q
mc admin user info penpot-s3 penpot-devenv |grep -F -q "readwrite"
if [ "$?" = "1" ]; then
mc admin policy attach penpot-s3 readwrite --user=penpot-devenv -q
fi
mc mb penpot-s3/penpot -p -q
export AWS_ACCESS_KEY_ID=penpot-devenv
export AWS_SECRET_ACCESS_KEY=penpot-devenv
export PENPOT_OBJECTS_STORAGE_BACKEND=s3
export PENPOT_OBJECTS_STORAGE_S3_ENDPOINT=http://minio:9000
export PENPOT_OBJECTS_STORAGE_S3_BUCKET=penpot
export PENPOT_OBJECTS_STORAGE_FS_DIRECTORY="assets"
export JAVA_OPTS="\
-Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager \
-Djdk.attach.allowAttachSelf \
-Dlog4j2.configurationFile=log4j2-devenv-repl.xml \
-Djdk.tracePinnedThreads=full \
-Dim4java.useV7=true \
-XX:+EnableDynamicAgentLoading \
-XX:-OmitStackTraceInFastThrow \
-XX:+UnlockDiagnosticVMOptions \
-XX:+DebugNonSafepoints \
--sun-misc-unsafe-memory-access=allow \
--enable-preview \
--enable-native-access=ALL-UNNAMED";
export JAVA_OPTS="$JAVA_OPTS -Dlog4j2.configurationFile=log4j2-devenv-repl.xml"
export OPTIONS="-A:jmx-remote -A:dev"
# Setup HEAP
# export OPTIONS="$OPTIONS -J-Xms50m -J-Xmx1024m"
# export OPTIONS="$OPTIONS -J-Xms1100m -J-Xmx1100m -J-XX:+AlwaysPreTouch"
# Increase virtual thread pool size
# export OPTIONS="$OPTIONS -J-Djdk.virtualThreadScheduler.parallelism=16"
# Disable C2 Compiler
# export OPTIONS="$OPTIONS -J-XX:TieredStopAtLevel=1"
# Disable all compilers
# export OPTIONS="$OPTIONS -J-Xint"
# Setup GC
# export OPTIONS="$OPTIONS -J-XX:+UseG1GC"
# Setup GC
# export OPTIONS="$OPTIONS -J-XX:+UseZGC"
export OPTIONS_EVAL="nil"
# export OPTIONS_EVAL="(set! *warn-on-reflection* true)"

View File

@@ -1,48 +0,0 @@
#!/usr/bin/env bash
source /home/penpot/environ
export PENPOT_FLAGS="$PENPOT_FLAGS disable-backend-worker"
export OPTIONS="
-A:jmx-remote -A:dev \
-J-Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager \
-J-Djdk.attach.allowAttachSelf \
-J-Dlog4j2.configurationFile=log4j2-experiments.xml \
-J-XX:-OmitStackTraceInFastThrow \
-J-XX:+UnlockDiagnosticVMOptions \
-J-XX:+DebugNonSafepoints \
-J-Djdk.tracePinnedThreads=full \
-J-XX:+UseTransparentHugePages \
-J-XX:ReservedCodeCacheSize=1g \
-J-Dpolyglot.engine.WarnInterpreterOnly=false \
-J--enable-preview";
# Setup HEAP
export OPTIONS="$OPTIONS -J-Xms320g -J-Xmx320g -J-XX:+AlwaysPreTouch"
export PENPOT_HTTP_SERVER_IO_THREADS=2
export PENPOT_HTTP_SERVER_WORKER_THREADS=2
# Increase virtual thread pool size
# export OPTIONS="$OPTIONS -J-Djdk.virtualThreadScheduler.parallelism=16"
# Disable C2 Compiler
# export OPTIONS="$OPTIONS -J-XX:TieredStopAtLevel=1"
# Disable all compilers
# export OPTIONS="$OPTIONS -J-Xint"
# Setup GC
export OPTIONS="$OPTIONS -J-XX:+UseG1GC -J-Xlog:gc:logs/gc.log"
# Setup GC
#export OPTIONS="$OPTIONS -J-XX:+UseZGC -J-XX:+ZGenerational -J-Xlog:gc:logs/gc.log"
# Enable ImageMagick v7.x support
# export OPTIONS="-J-Dim4java.useV7=true $OPTIONS";
export OPTIONS_EVAL="nil"
# export OPTIONS_EVAL="(set! *warn-on-reflection* true)"
set -ex
exec clojure $OPTIONS -M -e "$OPTIONS_EVAL" -m rebel-readline.main

View File

@@ -1,44 +1,13 @@
#!/usr/bin/env bash
export PENPOT_SECRET_KEY=super-secret-devenv-key
SCRIPT_DIR=$(dirname $0);
export PENPOT_HOST=devenv
export PENPOT_FLAGS="\
$PENPOT_FLAGS \
enable-backend-asserts \
enable-feature-fdata-pointer-map \
enable-feature-fdata-objects-map \
enable-file-snapshot \
enable-tiered-file-data-storage";
export JAVA_OPTS="
source $SCRIPT_DIR/_env;
-Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager \
export OPTIONS="-A:dev"
-Djdk.attach.allowAttachSelf \
-Dlog4j2.configurationFile=log4j2-devenv.xml \
-XX:+EnableDynamicAgentLoading \
-XX:-OmitStackTraceInFastThrow \
-XX:+UnlockDiagnosticVMOptions \
-XX:+DebugNonSafepoints";
export CLOJURE_OPTIONS="-A:dev"
# Default deletion delay for devenv
export PENPOT_DELETION_DELAY="24h"
# Setup default upload media file size to 100MiB
export PENPOT_MEDIA_MAX_FILE_SIZE=104857600
# Setup default multipart upload size to 300MiB
export PENPOT_HTTP_SERVER_MAX_MULTIPART_BODY_SIZE=314572800
export AWS_ACCESS_KEY_ID=penpot-devenv
export AWS_SECRET_ACCESS_KEY=penpot-devenv
export PENPOT_OBJECTS_STORAGE_BACKEND=s3
export PENPOT_OBJECTS_STORAGE_S3_ENDPOINT=http://minio:9000
export PENPOT_OBJECTS_STORAGE_S3_BUCKET=penpot
entrypoint=${1:-app.main};
shift 1;
set -ex
clojure $CLOJURE_OPTIONS -A:dev -M -m $entrypoint "$@";
exec clojure $OPTIONS -A:dev -M -m $entrypoint "$@";

View File

@@ -1,70 +1,11 @@
#!/usr/bin/env bash
export PENPOT_SECRET_KEY=super-secret-devenv-key
SCRIPT_DIR=$(dirname $0);
export PENPOT_HOST=devenv
source $SCRIPT_DIR/_env;
export PENPOT_FLAGS="\
$PENPOT_FLAGS \
enable-prepl-server \
enable-urepl-server \
enable-nrepl-server \
enable-webhooks \
enable-backend-asserts \
enable-audit-log \
enable-login-with-ldap \
enable-transit-readable-response \
enable-demo-users \
disable-feature-fdata-pointer-map \
enable-feature-fdata-objects-map \
disable-secure-session-cookies \
enable-rpc-climit \
enable-smtp \
enable-quotes \
enable-file-snapshot \
enable-access-tokens \
disable-tiered-file-data-storage \
enable-file-validation \
enable-file-schema-validation \
enable-subscriptions \
disable-subscriptions-old";
# Default deletion delay for devenv
export PENPOT_DELETION_DELAY="24h"
# Setup default upload media file size to 100MiB
export PENPOT_MEDIA_MAX_FILE_SIZE=104857600
# Setup default multipart upload size to 300MiB
export PENPOT_HTTP_SERVER_MAX_MULTIPART_BODY_SIZE=314572800
# Initialize MINIO config # Initialize MINIO config
mc alias set penpot-s3/ http://minio:9000 minioadmin minioadmin -q setup_minio;
mc admin user add penpot-s3 penpot-devenv penpot-devenv -q
mc admin user info penpot-s3 penpot-devenv |grep -F -q "readwrite"
if [ "$?" = "1" ]; then
mc admin policy attach penpot-s3 readwrite --user=penpot-devenv -q
fi
mc mb penpot-s3/penpot -p -q
export AWS_ACCESS_KEY_ID=penpot-devenv
export AWS_SECRET_ACCESS_KEY=penpot-devenv
export PENPOT_OBJECTS_STORAGE_BACKEND=s3
export PENPOT_OBJECTS_STORAGE_S3_ENDPOINT=http://minio:9000
export PENPOT_OBJECTS_STORAGE_S3_BUCKET=penpot
entrypoint=${1:-app.main};
export JAVA_OPTS="\
-Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager \
-Djdk.attach.allowAttachSelf \
-Dlog4j2.configurationFile=log4j2-devenv.xml \
-Djdk.tracePinnedThreads=full \
-Dim4java.useV7=true \
-XX:-OmitStackTraceInFastThrow \
--sun-misc-unsafe-memory-access=allow \
--enable-preview \
--enable-native-access=ALL-UNNAMED";
export OPTIONS="-A:jmx-remote -A:dev"
shift 1;
set -ex set -ex
clojure $OPTIONS -M -m $entrypoint; exec clojure -A:jmx-remote -A:dev -M -m app.main "$@";


@@ -434,10 +434,10 @@
   (sm/validator schema:info))

 (defn- get-info
-  [{:keys [::provider ::setup/props] :as cfg} {:keys [params] :as request}]
+  [{:keys [::provider] :as cfg} {:keys [params] :as request}]
   (let [state    (get params :state)
         code     (get params :code)
-        state    (tokens/verify props {:token state :iss :oauth})
+        state    (tokens/verify cfg {:token state :iss :oauth})
         tdata    (fetch-access-token cfg code)
         info     (case (cf/get :oidc-user-info-source)
                    :token (get-user-info cfg tdata)
@@ -516,7 +516,7 @@
                    :iss :prepared-register
                    :exp (ct/in-future {:hours 48}))
-        params  {:token (tokens/generate (::setup/props cfg) info)
+        params  {:token (tokens/generate cfg info)
                  :provider (:provider (:path-params request))
                  :fullname (:fullname info)}
         params  (d/without-nils params)]
@@ -569,7 +569,7 @@
     :else
     (let [sxf   (session/create-fn cfg (:id profile))
           token (or (:invitation-token info)
-                    (tokens/generate (::setup/props cfg)
+                    (tokens/generate cfg
                                      {:iss :auth
                                       :exp (ct/in-future "15m")
                                       :profile-id (:id profile)}))
@@ -620,8 +620,7 @@
                 :external-session-id esid
                 :props props
                 :exp (ct/in-future "4h")}
-        state  (tokens/generate (::setup/props cfg)
-                                (d/without-nils params))
+        state  (tokens/generate cfg (d/without-nils params))
         uri    (build-auth-uri cfg state)]
     {::yres/status 200
      ::yres/body {:redirect-uri uri}}))


@@ -34,8 +34,7 @@
    [clojure.set :as set]
    [cuerdas.core :as str]
    [datoteka.fs :as fs]
-   [datoteka.io :as io]
-   [promesa.exec :as px]))
+   [datoteka.io :as io]))

 (set! *warn-on-reflection* true)
@@ -142,13 +141,11 @@
   ([index coll attr]
    (reduce #(index-object %1 %2 attr) index coll)))

-(defn decode-row
-  [{:keys [data changes features] :as row}]
+(defn- decode-row-features
+  [{:keys [features] :as row}]
   (when row
     (cond-> row
-      features (assoc :features (db/decode-pgarray features #{}))
-      changes  (assoc :changes (blob/decode changes))
-      data     (assoc :data (blob/decode data)))))
+      (db/pgarray? features) (assoc :features (db/decode-pgarray features #{})))))

 (def sql:get-minimal-file
   "SELECT f.id,
@@ -162,23 +159,158 @@
   [cfg id & {:as opts}]
   (db/get-with-sql cfg [sql:get-minimal-file id] opts))

-(defn decode-file
-  "A general purpose file decoding function that resolves all external
-  pointers, run migrations and return plain vanilla file map"
-  [cfg {:keys [id] :as file} & {:keys [migrate?] :or {migrate? true}}]
-  (binding [pmap/*load-fn* (partial fdata/load-pointer cfg id)]
-    (let [file (->> file
-                    (fmigr/resolve-applied-migrations cfg)
-                    (fdata/resolve-file-data cfg))
-          libs (delay (get-resolved-file-libraries cfg file))]
-      (-> file
-          (update :features db/decode-pgarray #{})
-          (update :data blob/decode)
-          (update :data fdata/process-pointers deref)
-          (update :data fdata/process-objects (partial into {}))
-          (update :data assoc :id id)
-          (cond-> migrate? (fmg/migrate-file libs))))))
+(def sql:files-with-data
+  "SELECT f.id,
+          f.project_id,
+          f.created_at,
+          f.modified_at,
+          f.deleted_at,
+          f.name,
+          f.is_shared,
+          f.has_media_trimmed,
+          f.revn,
+          f.data AS legacy_data,
+          f.ignore_sync_until,
+          f.comment_thread_seqn,
+          f.features,
+          f.version,
+          f.vern,
+          p.team_id,
+          coalesce(fd.backend, 'legacy-db') AS backend,
+          fd.metadata AS metadata,
+          fd.data AS data
+     FROM file AS f
+     LEFT JOIN file_data AS fd ON (fd.file_id = f.id AND fd.id = f.id)
+    INNER JOIN project AS p ON (p.id = f.project_id)")
+
+(def sql:get-file
+  (str sql:files-with-data " WHERE f.id = ?"))
+
+(def sql:get-file-without-data
+  (str "WITH files AS (" sql:files-with-data ")"
+       "SELECT f.id,
+               f.project_id,
+               f.created_at,
+               f.modified_at,
+               f.deleted_at,
+               f.name,
+               f.is_shared,
+               f.has_media_trimmed,
+               f.revn,
+               f.ignore_sync_until,
+               f.comment_thread_seqn,
+               f.features,
+               f.version,
+               f.vern,
+               f.team_id
+          FROM files AS f
+         WHERE f.id = ?"))
+
+(defn- migrate-file
+  [{:keys [::db/conn] :as cfg} {:keys [read-only?]} {:keys [id] :as file}]
+  (binding [pmap/*load-fn* (partial fdata/load-pointer cfg id)
+            pmap/*tracked* (pmap/create-tracked)]
+    (let [libs (delay (get-resolved-file-libraries cfg file))
+          ;; To avoid the unnecessary overhead of creating multiple
+          ;; pointers and handling objects-map internally in the worst
+          ;; case (when probably all shapes and all pointers will be
+          ;; read anyway), we just realize/resolve them before applying
+          ;; the migration to the file.
+          file (-> (fdata/realize cfg file)
+                   (fmg/migrate-file libs))]
+      (if (or read-only? (db/read-only? conn))
+        file
+        (do ;; When a file is migrated, we break the rule of not
+            ;; performing mutations on get operations and update the
+            ;; file with all migrations applied
+          (update-file! cfg file)
+          (fmigr/resolve-applied-migrations cfg file))))))
+
+(defn- get-file*
+  [{:keys [::db/conn] :as cfg} id
+   {:keys [migrate?
+           realize?
+           decode?
+           skip-locked?
+           include-deleted?
+           load-data?
+           throw-if-not-exists?
+           lock-for-update?
+           lock-for-share?]
+    :or {lock-for-update? false
+         lock-for-share? false
+         load-data? true
+         migrate? true
+         decode? true
+         include-deleted? false
+         throw-if-not-exists? true
+         realize? false}
+    :as options}]
+
+  (assert (db/connection? conn) "expected cfg with valid connection")
+
+  (when (and (not load-data?)
+             (or lock-for-update? lock-for-share? skip-locked?))
+    (throw (IllegalArgumentException. "locking is incompatible when `load-data?` is false")))
+
+  (let [sql
+        (if load-data?
+          sql:get-file
+          sql:get-file-without-data)
+
+        sql
+        (cond
+          lock-for-update?
+          (str sql " FOR UPDATE of f")
+
+          lock-for-share?
+          (str sql " FOR SHARE of f")
+
+          :else
+          sql)
+
+        sql
+        (if skip-locked?
+          (str sql " SKIP LOCKED")
+          sql)
+
+        file
+        (db/get-with-sql conn [sql id]
+                         {::db/throw-if-not-exists false
+                          ::db/remove-deleted (not include-deleted?)})
+
+        file
+        (-> file
+            (d/update-when :features db/decode-pgarray #{})
+            (d/update-when :metadata fdata/decode-metadata))]
+
+    (if file
+      (if load-data?
+        (let [file
+              (->> file
+                   (fmigr/resolve-applied-migrations cfg)
+                   (fdata/resolve-file-data cfg))
+
+              will-migrate?
+              (and migrate? (fmg/need-migration? file))]
+
+          (if decode?
+            (cond->> (fdata/decode-file-data cfg file)
+              (and realize? (not will-migrate?))
+              (fdata/realize cfg)
+
+              will-migrate?
+              (migrate-file cfg options))
+            file))
+        file)
+      (when-not (or skip-locked? (not throw-if-not-exists?))
+        (ex/raise :type :not-found
+                  :code :object-not-found
+                  :hint "database object not found"
+                  :table :file
+                  :file-id id)))))
 (defn get-file
   "Get file, resolve all features and apply migrations.
@@ -187,10 +319,7 @@
   operations on file, because it removes the overhead of lazy fetching
   and decoding."
   [cfg file-id & {:as opts}]
-  (db/run! cfg (fn [{:keys [::db/conn] :as cfg}]
-                 (when-let [row (db/get* conn :file {:id file-id}
-                                         (assoc opts ::db/remove-deleted false))]
-                   (decode-file cfg row opts)))))
+  (db/run! cfg get-file* file-id opts))
 (defn clean-file-features
   [file]
@@ -214,12 +343,12 @@
   (let [conn (db/get-connection cfg)
         ids  (db/create-array conn "uuid" ids)]
     (->> (db/exec! conn [sql:get-teams ids])
-         (map decode-row))))
+         (map decode-row-features))))

 (defn get-team
   [cfg team-id]
   (-> (db/get cfg :team {:id team-id})
-      (decode-row)))
+      (decode-row-features)))

 (defn get-fonts
   [cfg team-id]
@@ -311,7 +440,6 @@
     (do
       (l/trc :hint "lookup index"
              :file-id (str file-id)
-             :snap-id (str (:snapshot-id file))
              :id (str id)
              :result (str (get mobj :id)))
       (get mobj :id))
@@ -328,7 +456,6 @@
     (doseq [[old-id item] missing-index]
       (l/dbg :hint "create missing references"
              :file-id (str file-id)
-             :snap-id (str (:snapshot-id file))
              :old-id (str old-id)
              :id (str (:id item)))
       (db/insert! conn :file-media-object item
@@ -339,12 +466,16 @@
 (def sql:get-file-media
   "SELECT * FROM file_media_object WHERE id = ANY(?)")

-(defn get-file-media
-  [cfg {:keys [data] :as file}]
-  (db/run! cfg (fn [{:keys [::db/conn]}]
-                 (let [used (cfh/collect-used-media data)
-                       used (db/create-array conn "uuid" used)]
-                   (db/exec! conn [sql:get-file-media used])))))
+(defn get-file-media*
+  [{:keys [::db/conn] :as cfg} {:keys [data id] :as file}]
+  (let [used (cfh/collect-used-media data)
+        used (db/create-array conn "uuid" used)]
+    (->> (db/exec! conn [sql:get-file-media used])
+         (mapv (fn [row] (assoc row :file-id id))))))
+
+(defn get-file-media
+  [cfg file]
+  (db/run! cfg get-file-media* file))

 (def ^:private sql:get-team-files-ids
   "SELECT f.id FROM file AS f
@@ -475,8 +606,8 @@
   ;; all of them, not only the applied
   (vary-meta dissoc ::fmg/migrated))))

-(defn encode-file
-  [{:keys [::wrk/executor] :as cfg} {:keys [id features] :as file}]
+(defn- encode-file
+  [cfg {:keys [id features] :as file}]
   (let [file (if (and (contains? features "fdata/objects-map")
                       (:data file))
                (fdata/enable-objects-map file)
@@ -493,18 +624,33 @@
     (-> file
         (d/update-when :features into-array)
-        (d/update-when :data (fn [data] (px/invoke! executor #(blob/encode data)))))))
+        (d/update-when :data blob/encode))))

 (defn- file->params
   [file]
   (-> (select-keys file file-attrs)
+      (assoc :data nil)
       (dissoc :team-id)
       (dissoc :migrations)))

+(defn- file->file-data-params
+  [{:keys [id] :as file} & {:as opts}]
+  (let [created-at  (or (:created-at file) (ct/now))
+        modified-at (or (:modified-at file) created-at)]
+    (d/without-nils
+     {:id id
+      :type "main"
+      :file-id id
+      :data (:data file)
+      :metadata (:metadata file)
+      :created-at created-at
+      :modified-at modified-at})))
 (defn insert-file!
   "Insert a new file into the database table. Expects a not-encoded file.
   Returns nil."
   [{:keys [::db/conn] :as cfg} file & {:as opts}]
+  (db/exec-one! conn ["SET CONSTRAINTS ALL DEFERRED"])
+
   (when (:migrations file)
     (fmigr/upsert-migrations! conn file))

   (let [file (encode-file cfg file)]
     (db/insert! conn :file
                 (file->params file)
-                {::db/return-keys false})
+                (assoc opts ::db/return-keys false))
+
+    (->> (file->file-data-params file)
+         (fdata/upsert! cfg))
+
     nil))
 (defn update-file!
   "Update an existing file on the database. Expects a not-encoded file."
   [{:keys [::db/conn] :as cfg} {:keys [id] :as file} & {:as opts}]
-  (if (::reset-migrations opts false)
+  (if (::reset-migrations? opts false)
     (fmigr/reset-migrations! conn file)
     (fmigr/upsert-migrations! conn file))

   (let [file
         (encode-file cfg file)

-        params
-        (file->params (dissoc file :id))]
-    (db/update! conn :file params
+        file-params
+        (file->params (dissoc file :id))
+
+        file-data-params
+        (file->file-data-params file)]
+
+    (db/update! conn :file file-params
                 {:id id}
                 {::db/return-keys false})
+
+    (fdata/upsert! cfg file-data-params)
+
     nil))
 (defn save-file!
   "Applies all the final validations and persists the file; binfile
   specific, should not be used outside of the binfile domain.

   Returns nil"
   [{:keys [::timestamp] :as cfg} file & {:as opts}]
   (assert (ct/inst? timestamp) "expected valid timestamp")
   (let [file (-> file
@@ -565,7 +719,7 @@
         (l/error :hint "file schema validation error" :cause result))))
     (if (::overwrite cfg)
-      (update-file! cfg file (assoc opts ::reset-migrations true))
+      (update-file! cfg file (assoc opts ::reset-migrations? true))
       (insert-file! cfg file opts))))
 (def ^:private sql:get-file-libraries
@@ -595,7 +749,7 @@
             l.version
        FROM libs AS l
       INNER JOIN project AS p ON (p.id = l.project_id)
-     WHERE l.deleted_at IS NULL OR l.deleted_at > now();")
+     WHERE l.deleted_at IS NULL;")

 (defn get-file-libraries
   [conn file-id]
@@ -604,7 +758,7 @@
        ;; FIXME: :is-indirect set to false on all rows looks
        ;; completely useless
        (map #(assoc % :is-indirect false))
-       (map decode-row))
+       (map decode-row-features))
   (db/exec! conn [sql:get-file-libraries file-id])))

 (defn get-resolved-file-libraries


@@ -346,7 +346,7 @@
         thumbnails (->> (bfc/get-file-object-thumbnails cfg file-id)
                         (mapv #(dissoc % :file-id)))
-        file       (cond-> (bfc/get-file cfg file-id)
+        file       (cond-> (bfc/get-file cfg file-id :realize? true)
                      detach?
                      (-> (ctf/detach-external-references file-id)
                          (dissoc :libraries))


@@ -153,7 +153,7 @@
 (defn- write-file!
   [cfg file-id]
-  (let [file   (bfc/get-file cfg file-id)
+  (let [file   (bfc/get-file cfg file-id :realize? true)
         thumbs (bfc/get-file-object-thumbnails cfg file-id)
         media  (bfc/get-file-media cfg file)
         rels   (bfc/get-files-rels cfg #{file-id})]


@@ -226,7 +226,9 @@
   (let [detach? (and (not embed-assets) (not include-libraries))]
     (db/tx-run! cfg (fn [cfg]
-                      (cond-> (bfc/get-file cfg file-id {::sql/for-update true})
+                      (cond-> (bfc/get-file cfg file-id
+                                            {:realize? true
+                                             :lock-for-update? true})
                         detach?
                         (-> (ctf/detach-external-references file-id)
                             (dissoc :libraries))
@@ -713,7 +715,7 @@
      :plugin-data plugin-data}))

 (defn- import-file
-  [{:keys [::bfc/project-id] :as cfg} {file-id :id file-name :name}]
+  [{:keys [::db/conn ::bfc/project-id] :as cfg} {file-id :id file-name :name}]
   (let [file-id' (bfc/lookup-index file-id)
         file     (read-file cfg file-id)
         media    (read-file-media cfg file-id)
@@ -726,26 +728,48 @@
             :version (:version file)
             ::l/sync? true)

-    (events/tap :progress {:section :file :name file-name})
-
-    (when media
-      ;; Update index with media
-      (l/dbg :hint "update media index"
-             :file-id (str file-id')
-             :total (count media)
-             ::l/sync? true)
-
-      (vswap! bfc/*state* update :index bfc/update-index (map :id media))
-      (vswap! bfc/*state* update :media into media))
-
-    (when thumbnails
-      (l/dbg :hint "update thumbnails index"
-             :file-id (str file-id')
-             :total (count thumbnails)
-             ::l/sync? true)
-
-      (vswap! bfc/*state* update :index bfc/update-index (map :media-id thumbnails))
-      (vswap! bfc/*state* update :thumbnails into thumbnails))
+    (vswap! bfc/*state* update :index bfc/update-index media :id)
+
+    (events/tap :progress {:section :media :file-id file-id})
+
+    (doseq [item media]
+      (let [params (-> item
+                       (update :id bfc/lookup-index)
+                       (assoc :file-id file-id')
+                       (d/update-when :media-id bfc/lookup-index)
+                       (d/update-when :thumbnail-id bfc/lookup-index))]
+        (l/dbg :hint "inserting media object"
+               :file-id (str file-id')
+               :id (str (:id params))
+               :media-id (str (:media-id params))
+               :thumbnail-id (str (:thumbnail-id params))
+               :old-id (str (:id item))
+               ::l/sync? true)
+        (db/insert! conn :file-media-object params
+                    ::db/on-conflict-do-nothing? (::bfc/overwrite cfg))))
+
+    (events/tap :progress {:section :thumbnails :file-id file-id})
+
+    (doseq [item thumbnails]
+      (let [media-id  (bfc/lookup-index (:media-id item))
+            object-id (-> (assoc item :file-id file-id')
+                          (cth/fmt-object-id))
+            params    {:file-id file-id'
+                       :object-id object-id
+                       :tag (:tag item)
+                       :media-id media-id}]
+        (l/dbg :hint "inserting object thumbnail"
+               :file-id (str file-id')
+               :media-id (str media-id)
+               ::l/sync? true)
+        (db/insert! conn :file-tagged-object-thumbnail params
+                    ::db/on-conflict-do-nothing? true)))
+
+    (events/tap :progress {:section :file :file-id file-id})

     (let [data (-> (read-file-data cfg file-id)
                    (d/without-nils)
@@ -796,16 +820,9 @@
     (doseq [{:keys [id entry]} entries]
       (let [object (->> (read-entry input entry)
                         (decode-storage-object)
-                        (validate-storage-object))]
-
-        (when (not= id (:id object))
-          (ex/raise :type :validation
-                    :code :inconsistent-penpot-file
-                    :hint "the penpot file seems corrupt, found unexpected uuid (storage-object-id)"
-                    :expected-id (str id)
-                    :found-id (str (:id object))))
-
-        (let [ext     (cmedia/mtype->extension (:content-type object))
-              path    (str "objects/" id ext)
-              content (->> path
-                           (get-zip-entry input)
+                        (validate-storage-object))
+            ext     (cmedia/mtype->extension (:content-type object))
+            path    (str "objects/" id ext)
+            content (->> path
+                         (get-zip-entry input)
@@ -841,48 +858,7 @@
                :bucket (:bucket params)
                ::l/sync? true)
         (vswap! bfc/*state* update :index assoc id (:id sobject))))))))

-(defn- import-file-media
-  [{:keys [::db/conn] :as cfg}]
-  (events/tap :progress {:section :media})
-
-  (doseq [item (:media @bfc/*state*)]
-    (let [params (-> item
-                     (update :id bfc/lookup-index)
-                     (update :file-id bfc/lookup-index)
-                     (d/update-when :media-id bfc/lookup-index)
-                     (d/update-when :thumbnail-id bfc/lookup-index))]
-      (l/dbg :hint "inserting file media object"
-             :old-id (str (:id item))
-             :id (str (:id params))
-             :file-id (str (:file-id params))
-             ::l/sync? true)
-      (db/insert! conn :file-media-object params
-                  ::db/on-conflict-do-nothing? (::bfc/overwrite cfg)))))
-
-(defn- import-file-thumbnails
-  [{:keys [::db/conn] :as cfg}]
-  (events/tap :progress {:section :thumbnails})
-  (doseq [item (:thumbnails @bfc/*state*)]
-    (let [file-id   (bfc/lookup-index (:file-id item))
-          media-id  (bfc/lookup-index (:media-id item))
-          object-id (-> (assoc item :file-id file-id)
-                        (cth/fmt-object-id))
-          params    {:file-id file-id
-                     :object-id object-id
-                     :tag (:tag item)
-                     :media-id media-id}]
-      (l/dbg :hint "inserting file object thumbnail"
-             :file-id (str file-id)
-             :media-id (str media-id)
-             ::l/sync? true)
-      (db/insert! conn :file-tagged-object-thumbnail params
-                  {::db/on-conflict-do-nothing? true}))))

 (defn- import-files*
   [{:keys [::manifest] :as cfg}]
@@ -890,6 +866,8 @@
   (vswap! bfc/*state* update :index bfc/update-index (:files manifest) :id)

+  (import-storage-objects cfg)
+
   (let [files  (get manifest :files)
         result (reduce (fn [result {:keys [id] :as file}]
                          (let [name' (get file :name)
@@ -902,10 +880,6 @@
                        files)]

     (import-file-relations cfg)
-    (import-storage-objects cfg)
-    (import-file-media cfg)
-    (import-file-thumbnails cfg)
-
     (bfm/apply-pending-migrations! cfg)

     result))
@@ -930,9 +904,8 @@
       (binding [bfc/*options* cfg
                 bfc/*reference-file* ref-file]
-        (import-file cfg file)
         (import-storage-objects cfg)
-        (import-file-media cfg)
+        (import-file cfg file)
         (bfc/invalidate-thumbnails cfg file-id)
         (bfm/apply-pending-migrations! cfg)


@@ -52,6 +52,8 @@
    :redis-uri "redis://redis/0"

+   :file-data-backend "legacy-db"
+
    :objects-storage-backend "fs"
    :objects-storage-fs-directory "assets"
@@ -96,7 +98,9 @@
    [:http-server-max-body-size {:optional true} ::sm/int]
    [:http-server-max-multipart-body-size {:optional true} ::sm/int]
    [:http-server-io-threads {:optional true} ::sm/int]
-   [:http-server-worker-threads {:optional true} ::sm/int]
+   [:http-server-max-worker-threads {:optional true} ::sm/int]
+
+   [:management-api-shared-key {:optional true} :string]

    [:telemetry-uri {:optional true} :string]
    [:telemetry-with-taiga {:optional true} ::sm/boolean] ;; DELETE
@@ -105,7 +109,8 @@
    [:auto-file-snapshot-timeout {:optional true} ::ct/duration]
    [:media-max-file-size {:optional true} ::sm/int]

-   [:deletion-delay {:optional true} ::ct/duration] ;; REVIEW
+   [:deletion-delay {:optional true} ::ct/duration]
+   [:file-clean-delay {:optional true} ::ct/duration]

    [:telemetry-enabled {:optional true} ::sm/boolean]
    [:default-blob-version {:optional true} ::sm/int]
    [:allow-demo-users {:optional true} ::sm/boolean]
@@ -146,7 +151,6 @@
    [:quotes-team-access-requests-per-team {:optional true} ::sm/int]
    [:quotes-team-access-requests-per-requester {:optional true} ::sm/int]

-   [:auth-data-cookie-domain {:optional true} :string]
    [:auth-token-cookie-name {:optional true} :string]
    [:auth-token-cookie-max-age {:optional true} ::ct/duration]
@@ -210,24 +214,27 @@
    [:prepl-host {:optional true} :string]
    [:prepl-port {:optional true} ::sm/int]

+   [:file-data-backend {:optional true} [:enum "db" "legacy-db" "storage"]]
+
    [:media-directory {:optional true} :string] ;; REVIEW
    [:media-uri {:optional true} :string]
    [:assets-path {:optional true} :string]

-   ;; Legacy, will be removed in 2.5
+   [:netty-io-threads {:optional true} ::sm/int]
+   [:executor-threads {:optional true} ::sm/int]
+
+   ;; DEPRECATED
    [:assets-storage-backend {:optional true} :keyword]
    [:storage-assets-fs-directory {:optional true} :string]
    [:storage-assets-s3-bucket {:optional true} :string]
    [:storage-assets-s3-region {:optional true} :keyword]
    [:storage-assets-s3-endpoint {:optional true} ::sm/uri]
-   [:storage-assets-s3-io-threads {:optional true} ::sm/int]

    [:objects-storage-backend {:optional true} :keyword]
    [:objects-storage-fs-directory {:optional true} :string]
    [:objects-storage-s3-bucket {:optional true} :string]
    [:objects-storage-s3-region {:optional true} :keyword]
-   [:objects-storage-s3-endpoint {:optional true} ::sm/uri]
-   [:objects-storage-s3-io-threads {:optional true} ::sm/int]]))
+   [:objects-storage-s3-endpoint {:optional true} ::sm/uri]]))

 (defn- parse-flags
   [config]
@@ -300,6 +307,11 @@
   (or (c/get config :deletion-delay)
       (ct/duration {:days 7})))

+(defn get-file-clean-delay
+  []
+  (or (c/get config :file-clean-delay)
+      (ct/duration {:days 2})))
+
 (defn get
   "A configuration getter. Helps code be more testable."
   ([key]


@@ -298,7 +298,7 @@
 (defn insert!
   "A helper that builds an insert sql statement and executes it. By
   default returns the inserted row with all the fields; you can delimit
-  the returned columns with the `::columns` option."
+  the returned columns with the `::sql/columns` option."
   [ds table params & {:as opts}]
   (let [conn (get-connectable ds)
         sql  (sql/insert table params opts)
@@ -379,9 +379,7 @@
 (defn is-row-deleted?
   [{:keys [deleted-at]}]
-  (and (ct/inst? deleted-at)
-       (< (inst-ms deleted-at)
-          (inst-ms (ct/now)))))
+  (some? deleted-at))

 (defn get*
   "Retrieve a single row from the database that matches simple filters. Do
@@ -406,10 +404,10 @@
               :hint "database object not found"))
     row))

 (defn get-with-sql
   [ds sql & {:as opts}]
-  (let [rows (cond->> (exec! ds sql opts)
+  (let [rows
+        (cond->> (exec! ds sql opts)
           (::remove-deleted opts true)
           (remove is-row-deleted?)
@@ -423,7 +421,6 @@
     (first rows)))

 (def ^:private default-plan-opts
   (-> default-opts
       (assoc :fetch-size 1000)
@@ -578,10 +575,10 @@
   [system f & params]
   (cond
     (connection? system)
-    (run! {::conn system} f)
+    (apply run! {::conn system} f params)

     (pool? system)
-    (run! {::pool system} f)
+    (apply run! {::pool system} f params)

     (::conn system)
     (apply f system params)


@@ -9,32 +9,46 @@
   (:require
    [app.common.data :as d]
    [app.common.exceptions :as ex]
-   [app.common.files.helpers :as cfh]
-   [app.common.files.migrations :as fmg]
    [app.common.logging :as l]
-   [app.common.schema :as sm]
-   [app.common.time :as ct]
    [app.common.types.objects-map :as omap]
-   [app.common.types.path :as path]
+   [app.config :as cf]
    [app.db :as db]
    [app.db.sql :as-alias sql]
    [app.storage :as sto]
    [app.util.blob :as blob]
    [app.util.objects-map :as omap.legacy]
-   [app.util.pointer-map :as pmap]
-   [app.worker :as wrk]
-   [promesa.exec :as px]))
+   [app.util.pointer-map :as pmap]))

-;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
-;; OFFLOAD
-;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
-
-(defn offloaded?
-  [file]
-  (= "objects-storage" (:data-backend file)))
-
 ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
 ;; OBJECTS-MAP
 ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;

+(defn process-objects
+  "Apply a function to all objects-map instances on the file. Usually
+  used to convert the objects-map instances to plain maps."
+  [fdata update-fn]
+  (if (contains? fdata :pages-index)
+    (update fdata :pages-index d/update-vals
+            (fn [page]
+              (update page :objects
+                      (fn [objects]
+                        (if (or (omap/objects-map? objects)
+                                (omap.legacy/objects-map? objects))
+                          (update-fn objects)
+                          objects)))))
+    fdata))
+
+(defn realize-objects
+  "Process a file and remove all instances of objects-map, realizing
+  them to plain data. Used in operations where it is more efficient to
+  have the whole file loaded in memory, or when we are going to persist
+  it in an alternative storage."
+  [_cfg file]
+  (update file :data process-objects (partial into {})))
+
 (defn enable-objects-map
   [file & _opts]
   (let [update-page
@@ -63,51 +77,194 @@
         (update :data update-data)
         (update :features disj "fdata/objects-map"))))

-(defn process-objects
-  "Apply a function to all objects-map on the file. Usualy used for convert
-  the objects-map instances to plain maps"
-  [fdata update-fn]
-  (if (contains? fdata :pages-index)
-    (update fdata :pages-index d/update-vals
-            (fn [page]
-              (update page :objects
-                      (fn [objects]
-                        (if (or (omap/objects-map? objects)
-                                (omap.legacy/objects-map? objects))
-                          (update-fn objects)
-                          objects)))))
-    fdata))
+;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
+;; STORAGE
+;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
+
+(defmulti resolve-file-data
+  (fn [_cfg file] (get file :backend "legacy-db")))
+
+(defmethod resolve-file-data "legacy-db"
+  [_cfg {:keys [legacy-data] :as file}]
+  (-> file
+      (assoc :data legacy-data)
+      (dissoc :legacy-data)))
+
+(defmethod resolve-file-data "db"
+  [_cfg file]
+  (dissoc file :legacy-data))
+
+(defmethod resolve-file-data "storage"
+  [cfg {:keys [metadata] :as file}]
+  (let [storage (sto/resolve cfg ::db/reuse-conn true)
+        ref-id  (:storage-ref-id metadata)
+        data    (->> (sto/get-object storage ref-id)
+                     (sto/get-object-bytes storage))]
+    (-> file
+        (assoc :data data)
+        (dissoc :legacy-data))))
+
+(defn decode-file-data
+  [_cfg {:keys [data] :as file}]
+  (cond-> file
+    (bytes? data)
+    (assoc :data (blob/decode data))))
+
+(def ^:private sql:insert-file-data
+  "INSERT INTO file_data (file_id, id, created_at, modified_at, deleted_at,
+                          type, backend, metadata, data)
+   VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)")
+
+(def ^:private sql:upsert-file-data
+  (str sql:insert-file-data
+       " ON CONFLICT (file_id, id)
+         DO UPDATE SET modified_at=?,
+                       deleted_at=?,
+                       backend=?,
+                       metadata=?,
data=?"))
(defn- upsert-in-database
[cfg {:keys [id file-id created-at modified-at deleted-at type backend data metadata]}]
(let [created-at (or created-at (ct/now))
metadata (some-> metadata db/json)
modified-at (or modified-at created-at)]
(db/exec-one! cfg [sql:upsert-file-data
file-id id
created-at
modified-at
deleted-at
type
backend
metadata
data
modified-at
deleted-at
backend
metadata
data])))
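The `sql:upsert-file-data` statement above binds the mutable columns twice: once for the INSERT values and once more for the `DO UPDATE` branch that runs on conflict. A minimal sketch of the same pattern using Python's sqlite3 (table shape simplified for illustration; not Penpot's actual schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE file_data ("
    " file_id TEXT, id TEXT, modified_at TEXT, backend TEXT, data BLOB,"
    " PRIMARY KEY (file_id, id))"
)

UPSERT = """
INSERT INTO file_data (file_id, id, modified_at, backend, data)
VALUES (?, ?, ?, ?, ?)
ON CONFLICT (file_id, id)
DO UPDATE SET modified_at = ?, backend = ?, data = ?
"""

def upsert(file_id, id_, modified_at, backend, data):
    # Mutable columns are bound twice: insert branch first, update branch after.
    conn.execute(UPSERT, (file_id, id_, modified_at, backend, data,
                          modified_at, backend, data))

upsert("f1", "d1", "2025-01-01", "db", b"v1")
upsert("f1", "d1", "2025-01-02", "storage", b"v2")  # hits the conflict branch
row = conn.execute("SELECT backend, data FROM file_data").fetchone()
```

The second call updates the existing row in place instead of inserting a duplicate, which is exactly what `upsert-in-database` relies on.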
(defn- handle-persistence
[cfg {:keys [type backend id file-id data] :as params}]
(cond
(= backend "storage")
(let [storage (sto/resolve cfg)
content (sto/content data)
sobject (sto/put-object! storage
{::sto/content content
::sto/touch true
:bucket "file-data"
:content-type "application/octet-stream"
:file-id file-id
:id id})
metadata {:storage-ref-id (:id sobject)}
params (-> params
(assoc :metadata metadata)
(assoc :data nil))]
(upsert-in-database cfg params))
(= backend "db")
(->> (dissoc params :metadata)
(upsert-in-database cfg))
(= backend "legacy-db")
(cond
(= type "main")
(do
(db/delete! cfg :file-data
{:id id :file-id file-id :type "main"}
{::db/return-keys false})
(db/update! cfg :file
{:data data}
{:id file-id}
{::db/return-keys false}))
(= type "snapshot")
(do
(db/delete! cfg :file-data
{:id id :file-id file-id :type "snapshot"}
{::db/return-keys false})
(db/update! cfg :file-change
{:data data}
{:file-id file-id :id id}
{::db/return-keys false}))
(= type "fragment")
(upsert-in-database cfg
(-> (dissoc params :metadata)
(assoc :backend "db")))
:else
(throw (RuntimeException. "not implemented")))
:else
(throw (IllegalArgumentException.
(str "backend '" backend "' not supported")))))
(defn process-metadata
[cfg metadata]
(when-let [storage-id (:storage-ref-id metadata)]
(let [storage (sto/resolve cfg ::db/reuse-conn true)]
(sto/touch-object! storage storage-id))))
(defn- default-backend
[backend]
(or backend (cf/get :file-data-backend)))
(def ^:private schema:metadata
[:map {:title "Metadata"}
[:storage-ref-id {:optional true} ::sm/uuid]])
(def decode-metadata-with-schema
(sm/decoder schema:metadata sm/json-transformer))
(defn decode-metadata
[metadata]
(some-> metadata
(db/decode-json-pgobject)
(decode-metadata-with-schema)))
(def ^:private schema:update-params
[:map {:closed true}
[:id ::sm/uuid]
[:type [:enum "main" "snapshot" "fragment"]]
[:file-id ::sm/uuid]
[:backend {:optional true} [:enum "db" "legacy-db" "storage"]]
[:metadata {:optional true} [:maybe schema:metadata]]
[:data {:optional true} bytes?]
[:created-at {:optional true} ::ct/inst]
[:modified-at {:optional true} [:maybe ::ct/inst]]
[:deleted-at {:optional true} [:maybe ::ct/inst]]])
(def ^:private check-update-params
(sm/check-fn schema:update-params :hint "invalid params received for update"))
(defn upsert!
"Create or update file data"
[cfg params & {:as opts}]
(let [params (-> (check-update-params params)
(update :backend default-backend))]
(some->> (:metadata params)
(process-metadata cfg))
(-> (handle-persistence cfg params)
(db/get-update-count)
(pos?))))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;; ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; POINTER-MAP ;; POINTER-MAP
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;; ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defn get-file-data
"Get file data given a file instance."
[system file]
(if (offloaded? file)
(let [storage (sto/resolve system ::db/reuse-conn true)]
(->> (sto/get-object storage (:data-ref-id file))
(sto/get-object-bytes storage)))
(:data file)))
(defn resolve-file-data
[system file]
(let [data (get-file-data system file)]
(assoc file :data data)))
(defn decode-file-data
[{:keys [::wrk/executor]} {:keys [data] :as file}]
(cond-> file
(bytes? data)
(assoc :data (px/invoke! executor #(blob/decode data)))))
(defn load-pointer (defn load-pointer
"A database loader pointer helper" "A database loader pointer helper"
[system file-id id] [cfg file-id id]
(let [fragment (db/get* system :file-data-fragment (let [fragment (some-> (db/get* cfg :file-data
{:id id :file-id file-id} {:id id :file-id file-id :type "fragment"}
{::sql/columns [:data :data-backend :data-ref-id :id]})] {::sql/columns [:data :backend :id :metadata]})
(update :metadata decode-metadata))]
(l/trc :hint "load pointer" (l/trc :hint "load pointer"
:file-id (str file-id) :file-id (str file-id)
@@ -121,22 +278,21 @@
:file-id file-id :file-id file-id
:fragment-id id)) :fragment-id id))
(let [data (get-file-data system fragment)] (-> (resolve-file-data cfg fragment)
;; FIXME: conditional thread scheduling for decoding big objects (get :data)
(blob/decode data)))) (blob/decode))))
(defn persist-pointers! (defn persist-pointers!
"Persist all currently tracked pointer objects" "Persist all currently tracked pointer objects"
[system file-id] [cfg file-id]
(let [conn (db/get-connection system)]
(doseq [[id item] @pmap/*tracked*] (doseq [[id item] @pmap/*tracked*]
(when (pmap/modified? item) (when (pmap/modified? item)
(l/trc :hint "persist pointer" :file-id (str file-id) :id (str id)) (l/trc :hint "persist pointer" :file-id (str file-id) :id (str id))
(let [content (-> item deref blob/encode)] (let [content (-> item deref blob/encode)]
(db/insert! conn :file-data-fragment (upsert! cfg {:id id
{:id id
:file-id file-id :file-id file-id
:data content})))))) :type "fragment"
:data content})))))
(defn process-pointers (defn process-pointers
"Apply a function to all pointers on the file. Usuly used for "Apply a function to all pointers on the file. Usuly used for
@@ -150,6 +306,14 @@
(d/update-vals update-fn') (d/update-vals update-fn')
(update :pages-index d/update-vals update-fn')))) (update :pages-index d/update-vals update-fn'))))
(defn realize-pointers
"Process a file and remove all instances of pointers realizing them to
a plain data. Used in operation where is more efficient have the
whole file loaded in memory."
[cfg {:keys [id] :as file}]
(binding [pmap/*load-fn* (partial load-pointer cfg id)]
(update file :data process-pointers deref)))
(defn get-used-pointer-ids (defn get-used-pointer-ids
"Given a file, return all pointer ids used in the data." "Given a file, return all pointer ids used in the data."
[fdata] [fdata]
@@ -169,47 +333,12 @@
(update :features conj "fdata/pointer-map"))) (update :features conj "fdata/pointer-map")))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;; ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; PATH-DATA ;; GENERAL PURPOSE HELPERS
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;; ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defn enable-path-data (defn realize
"Enable the fdata/path-data feature on the file." "A helper that combines realize-pointers and realize-objects"
[file & _opts] [cfg file]
(letfn [(update-object [object] (->> file
(if (or (cfh/path-shape? object) (realize-pointers cfg)
(cfh/bool-shape? object)) (realize-objects cfg)))
(update object :content path/content)
object))
(update-container [container]
(d/update-when container :objects d/update-vals update-object))]
(-> file
(update :data (fn [data]
(-> data
(update :pages-index d/update-vals update-container)
(d/update-when :components d/update-vals update-container))))
(update :features conj "fdata/path-data"))))
(defn disable-path-data
[file & _opts]
(letfn [(update-object [object]
(if (or (cfh/path-shape? object)
(cfh/bool-shape? object))
(update object :content vec)
object))
(update-container [container]
(d/update-when container :objects d/update-vals update-object))]
(when-let [conn db/*conn*]
(db/delete! conn :file-migration {:file-id (:id file)
:name "0003-convert-path-content"}))
(-> file
(update :data (fn [data]
(-> data
(update :pages-index d/update-vals update-container)
(d/update-when :components d/update-vals update-container))))
(update :features disj "fdata/path-data")
(update :migrations disj "0003-convert-path-content")
(vary-meta update ::fmg/migrated disj "0003-convert-path-content"))))


@@ -0,0 +1,446 @@
;; This Source Code Form is subject to the terms of the Mozilla Public
;; License, v. 2.0. If a copy of the MPL was not distributed with this
;; file, You can obtain one at http://mozilla.org/MPL/2.0/.
;;
;; Copyright (c) KALEIDOS INC
(ns app.features.file-snapshots
(:require
[app.binfile.common :as bfc]
[app.common.data :as d]
[app.common.exceptions :as ex]
[app.common.features :as-alias cfeat]
[app.common.files.migrations :as fmg]
[app.common.logging :as l]
[app.common.schema :as sm]
[app.common.time :as ct]
[app.common.uuid :as uuid]
[app.config :as cf]
[app.db :as db]
[app.db.sql :as-alias sql]
[app.features.fdata :as fdata]
[app.storage :as sto]
[app.util.blob :as blob]
[app.worker :as wrk]
[cuerdas.core :as str]))
(def sql:snapshots
"SELECT c.id,
c.label,
c.created_at,
c.updated_at AS modified_at,
c.deleted_at,
c.profile_id,
c.created_by,
c.locked_by,
c.revn,
c.features,
c.migrations,
c.version,
c.file_id,
c.data AS legacy_data,
fd.data AS data,
coalesce(fd.backend, 'legacy-db') AS backend,
fd.metadata AS metadata
FROM file_change AS c
LEFT JOIN file_data AS fd ON (fd.file_id = c.file_id
AND fd.id = c.id
AND fd.type = 'snapshot')
WHERE c.label IS NOT NULL")
(defn- decode-snapshot
[snapshot]
(some-> snapshot
(-> (d/update-when :metadata fdata/decode-metadata)
(d/update-when :migrations db/decode-pgarray [])
(d/update-when :features db/decode-pgarray #{}))))
(def ^:private sql:get-minimal-file
"SELECT f.id,
f.revn,
f.modified_at,
f.deleted_at,
fd.backend AS backend,
fd.metadata AS metadata
FROM file AS f
LEFT JOIN file_data AS fd ON (fd.file_id = f.id AND fd.id = f.id)
WHERE f.id = ?")
(defn- get-minimal-file
[cfg id & {:as opts}]
(-> (db/get-with-sql cfg [sql:get-minimal-file id] opts)
(d/update-when :metadata fdata/decode-metadata)))
(def ^:private sql:get-snapshot-without-data
(str "WITH snapshots AS (" sql:snapshots ")"
"SELECT c.id,
c.label,
c.revn,
c.created_at,
c.modified_at,
c.deleted_at,
c.profile_id,
c.created_by,
c.locked_by,
c.features,
c.metadata,
c.migrations,
c.version,
c.file_id
FROM snapshots AS c
WHERE c.id = ?
AND CASE WHEN c.created_by = 'user'
THEN c.deleted_at IS NULL
WHEN c.created_by = 'system'
THEN c.deleted_at IS NULL OR c.deleted_at >= ?::timestamptz
END"))
(defn get-minimal-snapshot
[cfg snapshot-id]
(let [now (ct/now)]
(-> (db/get-with-sql cfg [sql:get-snapshot-without-data snapshot-id now]
{::db/remove-deleted false})
(decode-snapshot))))
(def ^:private sql:get-snapshot
(str sql:snapshots
" AND c.file_id = ?
AND c.id = ?
AND CASE WHEN c.created_by = 'user'
THEN (c.deleted_at IS NULL)
WHEN c.created_by = 'system'
THEN (c.deleted_at IS NULL OR c.deleted_at >= ?::timestamptz)
END"))
(defn- get-snapshot
"Get snapshot with decoded data"
[cfg file-id snapshot-id]
(let [now (ct/now)]
(->> (db/get-with-sql cfg [sql:get-snapshot file-id snapshot-id now]
{::db/remove-deleted false})
(decode-snapshot)
(fdata/resolve-file-data cfg)
(fdata/decode-file-data cfg))))
(def ^:private sql:get-visible-snapshots
(str "WITH "
"snapshots1 AS ( " sql:snapshots "),"
"snapshots2 AS (
SELECT c.id,
c.label,
c.revn,
c.version,
c.created_at,
c.modified_at,
c.created_by,
c.locked_by,
c.profile_id,
c.deleted_at
FROM snapshots1 AS c
WHERE c.file_id = ?
), snapshots3 AS (
(SELECT * FROM snapshots2
WHERE created_by = 'system'
AND (deleted_at IS NULL OR
deleted_at >= ?::timestamptz)
LIMIT 500)
UNION ALL
(SELECT * FROM snapshots2
WHERE created_by = 'user'
AND deleted_at IS NULL
LIMIT 500)
)
SELECT * FROM snapshots3
ORDER BY created_at DESC"))
(defn get-visible-snapshots
"Return a list of snapshots fecheable from the API, it has a limited
set of fields and applies big but safe limits over all available
snapshots. It return a ordered vector by the snapshot date of
creation."
[cfg file-id]
(let [now (ct/now)]
(->> (db/exec! cfg [sql:get-visible-snapshots file-id now])
(mapv decode-snapshot))))
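The `snapshots3` CTE above caps each snapshot class separately (up to 500 system snapshots that are live or deleted after the cutoff, plus up to 500 non-deleted user snapshots) before ordering the union by creation date. The same selection logic, sketched in Python with field names assumed to mirror the SQL columns:

```python
def visible_snapshots(snapshots, now, limit=500):
    # System snapshots: keep if not deleted, or deleted only in the future.
    system = [s for s in snapshots
              if s["created_by"] == "system"
              and (s["deleted_at"] is None or s["deleted_at"] >= now)][:limit]
    # User snapshots: keep only if not deleted at all.
    user = [s for s in snapshots
            if s["created_by"] == "user" and s["deleted_at"] is None][:limit]
    # Newest first, matching ORDER BY created_at DESC.
    return sorted(system + user, key=lambda s: s["created_at"], reverse=True)
```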
(def ^:private schema:decoded-file
[:map {:title "DecodedFile"}
[:id ::sm/uuid]
[:revn :int]
[:vern :int]
[:data :map]
[:version :int]
[:features ::cfeat/features]
[:migrations [::sm/set :string]]])
(def ^:private schema:snapshot
[:map {:title "Snapshot"}
[:id ::sm/uuid]
[:revn [::sm/int {:min 0}]]
[:version [::sm/int {:min 0}]]
[:features ::cfeat/features]
[:migrations [::sm/set ::sm/text]]
[:profile-id {:optional true} ::sm/uuid]
[:label ::sm/text]
[:file-id ::sm/uuid]
[:created-by [:enum "system" "user" "admin"]]
[:deleted-at {:optional true} ::ct/inst]
[:modified-at ::ct/inst]
[:created-at ::ct/inst]])
(def ^:private check-snapshot
(sm/check-fn schema:snapshot))
(def ^:private check-decoded-file
(sm/check-fn schema:decoded-file))
(defn- generate-snapshot-label
[]
(let [ts (-> (ct/now)
(ct/format-inst)
(str/replace #"[T:\.]" "-")
(str/rtrim "Z"))]
(str "snapshot-" ts)))
(def ^:private schema:create-params
[:map {:title "SnapshotCreateParams"}
[:profile-id ::sm/uuid]
[:created-by {:optional true} [:enum "user" "system"]]
[:label {:optional true} ::sm/text]
[:session-id {:optional true} ::sm/uuid]
[:modified-at {:optional true} ::ct/inst]
[:deleted-at {:optional true} ::ct/inst]])
(def ^:private check-create-params
(sm/check-fn schema:create-params))
(defn create!
"Create a file snapshot; expects a non-encoded file"
[cfg file & {:as params}]
(let [{:keys [label created-by deleted-at profile-id session-id]}
(check-create-params params)
file
(check-decoded-file file)
created-by
(or created-by "system")
snapshot-id
(uuid/next)
created-at
(ct/now)
deleted-at
(or deleted-at
(if (= created-by "system")
(ct/in-future (cf/get-deletion-delay))
nil))
label
(or label (generate-snapshot-label))
snapshot
(cond-> {:id snapshot-id
:revn (:revn file)
:version (:version file)
:file-id (:id file)
:features (:features file)
:migrations (:migrations file)
:label label
:created-at created-at
:modified-at created-at
:created-by created-by}
deleted-at
(assoc :deleted-at deleted-at)
:always
(check-snapshot))]
(db/insert! cfg :file-change
(-> snapshot
(update :features into-array)
(update :migrations into-array)
(assoc :updated-at created-at)
(assoc :profile-id profile-id)
(assoc :session-id session-id)
(dissoc :modified-at))
{::db/return-keys false})
(fdata/upsert! cfg
{:id snapshot-id
:file-id (:id file)
:type "snapshot"
:data (blob/encode (:data file))
:created-at created-at
:deleted-at deleted-at})
snapshot))
(def ^:private schema:update-params
[:map {:title "SnapshotUpdateParams"}
[:id ::sm/uuid]
[:file-id ::sm/uuid]
[:label ::sm/text]
[:modified-at {:optional true} ::ct/inst]])
(def ^:private check-update-params
(sm/check-fn schema:update-params))
(defn update!
[cfg params]
(let [{:keys [id file-id label modified-at]}
(check-update-params params)
modified-at
(or modified-at (ct/now))]
(db/update! cfg :file-data
{:deleted-at nil
:modified-at modified-at}
{:file-id file-id
:id id
:type "snapshot"}
{::db/return-keys false})
(-> (db/update! cfg :file-change
{:label label
:created-by "user"
:updated-at modified-at
:deleted-at nil}
{:file-id file-id
:id id}
{::db/return-keys false})
(db/get-update-count)
(pos?))))
(defn restore!
[{:keys [::db/conn] :as cfg} file-id snapshot-id]
(let [file (get-minimal-file conn file-id {::db/for-update true})
vern (rand-int Integer/MAX_VALUE)
storage
(sto/resolve cfg {::db/reuse-conn true})
snapshot
(get-snapshot cfg file-id snapshot-id)]
(when-not snapshot
(ex/raise :type :not-found
:code :snapshot-not-found
:hint "unable to find snapshot with the provided label"
:snapshot-id snapshot-id
:file-id file-id))
(when-not (:data snapshot)
(ex/raise :type :internal
:code :snapshot-without-data
:hint "snapshot has no data"
:label (:label snapshot)
:file-id file-id))
(let [;; If the snapshot has applied migrations stored, we reuse
;; them; if not, we take the safest set of migrations as a
;; starting point. This is because, at the time of
;; implementing snapshots, migrations were not taken into
;; account, so we need to make this backward compatible in
;; some way.
migrations
(or (:migrations snapshot)
(fmg/generate-migrations-from-version 67))
file
(-> file
(update :revn inc)
(assoc :migrations migrations)
(assoc :data (:data snapshot))
(assoc :vern vern)
(assoc :version (:version snapshot))
(assoc :has-media-trimmed false)
(assoc :modified-at (:modified-at snapshot))
(assoc :features (:features snapshot)))]
(l/dbg :hint "restoring snapshot"
:file-id (str file-id)
:label (:label snapshot)
:snapshot-id (str (:id snapshot)))
;; In the same way, on resetting the file data, we need to restore
;; the migrations that were applied at the moment of taking the snapshot
(bfc/update-file! cfg file ::bfc/reset-migrations? true)
;; FIXME: this should be separated functions, we should not have
;; inline sql here.
;; clean object thumbnails
(let [sql (str "update file_tagged_object_thumbnail "
" set deleted_at = now() "
" where file_id=? returning media_id")
res (db/exec! conn [sql file-id])]
(doseq [media-id (into #{} (keep :media-id) res)]
(sto/touch-object! storage media-id)))
;; clean file thumbnails
(let [sql (str "update file_thumbnail "
" set deleted_at = now() "
" where file_id=? returning media_id")
res (db/exec! conn [sql file-id])]
(doseq [media-id (into #{} (keep :media-id) res)]
(sto/touch-object! storage media-id)))
vern)))
(defn delete!
[cfg & {:keys [id file-id deleted-at]}]
(assert (uuid? id) "missing id")
(assert (uuid? file-id) "missing file-id")
(assert (ct/inst? deleted-at) "missing deleted-at")
(wrk/submit! {::db/conn (db/get-connection cfg)
::wrk/task :delete-object
::wrk/params {:object :snapshot
:deleted-at deleted-at
:file-id file-id
:id id}})
(db/update! cfg :file-change
{:deleted-at deleted-at}
{:id id :file-id file-id}
{::db/return-keys false})
true)
(def ^:private sql:get-snapshots
(str sql:snapshots " AND c.file_id = ?"))
(defn lock-by!
[conn id profile-id]
(-> (db/update! conn :file-change
{:locked-by profile-id}
{:id id}
{::db/return-keys false})
(db/get-update-count)
(pos?)))
(defn unlock!
[conn id]
(-> (db/update! conn :file-change
{:locked-by nil}
{:id id}
{::db/return-keys false})
(db/get-update-count)
(pos?)))
(defn reduce-snapshots
"Process the file snapshots using efficient reduction; the file
reduction comes with all snapshots, including maked as deleted"
[cfg file-id xform f init]
(let [conn (db/get-connection cfg)
xform (comp
(map (partial fdata/resolve-file-data cfg))
(map (partial fdata/decode-file-data cfg))
xform)]
(->> (db/plan conn [sql:get-snapshots file-id] {:fetch-size 1})
(transduce xform f init))))


@@ -19,6 +19,7 @@
[app.http.errors :as errors] [app.http.errors :as errors]
[app.http.management :as mgmt] [app.http.management :as mgmt]
[app.http.middleware :as mw] [app.http.middleware :as mw]
[app.http.security :as sec]
[app.http.session :as session] [app.http.session :as session]
[app.http.websocket :as-alias ws] [app.http.websocket :as-alias ws]
[app.main :as-alias main] [app.main :as-alias main]
@@ -26,9 +27,7 @@
[app.rpc :as-alias rpc] [app.rpc :as-alias rpc]
[app.rpc.doc :as-alias rpc.doc] [app.rpc.doc :as-alias rpc.doc]
[app.setup :as-alias setup] [app.setup :as-alias setup]
[app.worker :as wrk]
[integrant.core :as ig] [integrant.core :as ig]
[promesa.exec :as px]
[reitit.core :as r] [reitit.core :as r]
[reitit.middleware :as rr] [reitit.middleware :as rr]
[yetti.adapter :as yt] [yetti.adapter :as yt]
@@ -55,6 +54,8 @@
[:map [:map
[::port ::sm/int] [::port ::sm/int]
[::host ::sm/text] [::host ::sm/text]
[::io-threads {:optional true} ::sm/int]
[::max-worker-threads {:optional true} ::sm/int]
[::max-body-size {:optional true} ::sm/int] [::max-body-size {:optional true} ::sm/int]
[::max-multipart-body-size {:optional true} ::sm/int] [::max-multipart-body-size {:optional true} ::sm/int]
[::router {:optional true} [:fn r/router?]] [::router {:optional true} [:fn r/router?]]
@@ -65,20 +66,30 @@
(assert (sm/check schema:server-params params))) (assert (sm/check schema:server-params params)))
(defmethod ig/init-key ::server (defmethod ig/init-key ::server
[_ {:keys [::handler ::router ::host ::port ::wrk/executor] :as cfg}] [_ {:keys [::handler ::router ::host ::port ::mtx/metrics] :as cfg}]
(l/info :hint "starting http server" :port port :host host) (l/info :hint "starting http server" :port port :host host)
(let [options {:http/port port (let [on-dispatch
(fn [_ start-at-ns]
(let [timing (- (System/nanoTime) start-at-ns)
timing (int (/ timing 1000000))]
(mtx/run! metrics
:id :http-server-dispatch-timing
:val timing)))
options
{:http/port port
:http/host host :http/host host
:http/max-body-size (::max-body-size cfg) :http/max-body-size (::max-body-size cfg)
:http/max-multipart-body-size (::max-multipart-body-size cfg) :http/max-multipart-body-size (::max-multipart-body-size cfg)
:xnio/direct-buffers false :xnio/direct-buffers false
:xnio/io-threads (or (::io-threads cfg) :xnio/io-threads (::io-threads cfg)
(max 3 (px/get-available-processors))) :xnio/max-worker-threads (::max-worker-threads cfg)
:xnio/dispatch executor
:ring/compat :ring2 :ring/compat :ring2
:events/on-dispatch on-dispatch
:socket/backlog 4069} :socket/backlog 4069}
handler (cond handler
(cond
(some? router) (some? router)
(router-handler router) (router-handler router)
@@ -88,8 +99,8 @@
:else :else
(throw (UnsupportedOperationException. "handler or router are required"))) (throw (UnsupportedOperationException. "handler or router are required")))
options (d/without-nils options) server
server (yt/server handler options)] (yt/server handler (d/without-nils options))]
(assoc cfg ::server (yt/start! server)))) (assoc cfg ::server (yt/start! server))))
@@ -157,6 +168,7 @@
[_ cfg] [_ cfg]
(rr/router (rr/router
[["" {:middleware [[mw/server-timing] [["" {:middleware [[mw/server-timing]
[sec/sec-fetch-metadata]
[mw/params] [mw/params]
[mw/format-response] [mw/format-response]
[session/soft-auth cfg] [session/soft-auth cfg]
@@ -177,7 +189,8 @@
(::ws/routes cfg) (::ws/routes cfg)
["/api" {:middleware [[mw/cors]]} ["/api" {:middleware [[mw/cors]
[sec/client-header-check]]}
(::oidc/routes cfg) (::oidc/routes cfg)
(::rpc.doc/routes cfg) (::rpc.doc/routes cfg)
(::rpc/routes cfg)]]])) (::rpc/routes cfg)]]]))


@@ -14,18 +14,18 @@
[app.tokens :as tokens] [app.tokens :as tokens]
[yetti.request :as yreq])) [yetti.request :as yreq]))
(def header-re #"^Token\s+(.*)") (def header-re #"(?i)^Token\s+(.*)")
(defn- get-token (defn get-token
[request] [request]
(some->> (yreq/get-header request "authorization") (some->> (yreq/get-header request "authorization")
(re-matches header-re) (re-matches header-re)
(second))) (second)))
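The change to `header-re` adds the `(?i)` flag, so the `Token` scheme now matches regardless of case. A Python sketch of the resulting extraction (function name is illustrative):

```python
import re

# Equivalent of (?i)^Token\s+(.*): case-insensitive scheme match.
HEADER_RE = re.compile(r"Token\s+(.*)", re.IGNORECASE)

def get_token(authorization):
    if authorization is None:
        return None
    # Clojure's re-matches anchors the whole string, hence fullmatch.
    m = HEADER_RE.fullmatch(authorization)
    return m.group(1) if m else None
```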
(defn- decode-token (defn- decode-token
[props token] [cfg token]
(when token (when token
(tokens/verify props {:token token :iss "access-token"}))) (tokens/verify cfg {:token token :iss "access-token"})))
(def sql:get-token-data (def sql:get-token-data
"SELECT perms, profile_id, expires_at "SELECT perms, profile_id, expires_at
@@ -43,11 +43,11 @@
(defn- wrap-soft-auth (defn- wrap-soft-auth
"Soft Authentication, will be executed synchronously on the undertow "Soft Authentication, will be executed synchronously on the undertow
worker thread." worker thread."
[handler {:keys [::setup/props]}] [handler cfg]
(letfn [(handle-request [request] (letfn [(handle-request [request]
(try (try
(let [token (get-token request) (let [token (get-token request)
claims (decode-token props token)] claims (decode-token cfg token)]
(cond-> request (cond-> request
(map? claims) (map? claims)
(assoc ::id (:tid claims)))) (assoc ::id (:tid claims))))


@@ -17,11 +17,9 @@
[app.main :as-alias main] [app.main :as-alias main]
[app.setup :as-alias setup] [app.setup :as-alias setup]
[app.tokens :as tokens] [app.tokens :as tokens]
[app.worker :as-alias wrk]
[clojure.data.json :as j] [clojure.data.json :as j]
[cuerdas.core :as str] [cuerdas.core :as str]
[integrant.core :as ig] [integrant.core :as ig]
[promesa.exec :as px]
[yetti.request :as yreq] [yetti.request :as yreq]
[yetti.response :as-alias yres])) [yetti.response :as-alias yres]))
@@ -40,8 +38,8 @@
[_ cfg] [_ cfg]
(letfn [(handler [request] (letfn [(handler [request]
(let [data (-> request yreq/body slurp)] (let [data (-> request yreq/body slurp)]
(px/run! :vthread (partial handle-request cfg data))) (handle-request cfg data)
{::yres/status 200})] {::yres/status 200}))]
["/sns" {:handler handler ["/sns" {:handler handler
:allowed-methods #{:post}}])) :allowed-methods #{:post}}]))
@@ -109,7 +107,7 @@
[cfg headers] [cfg headers]
(let [tdata (get headers "x-penpot-data")] (let [tdata (get headers "x-penpot-data")]
(when-not (str/empty? tdata) (when-not (str/empty? tdata)
(let [result (tokens/verify (::setup/props cfg) {:token tdata :iss :profile-identity})] (let [result (tokens/verify cfg {:token tdata :iss :profile-identity})]
(:profile-id result))))) (:profile-id result)))))
(defn- parse-notification (defn- parse-notification


@@ -27,6 +27,7 @@
[app.rpc.commands.profile :as profile] [app.rpc.commands.profile :as profile]
[app.rpc.commands.teams :as teams] [app.rpc.commands.teams :as teams]
[app.setup :as-alias setup] [app.setup :as-alias setup]
[app.setup.clock :as clock]
[app.srepl.main :as srepl] [app.srepl.main :as srepl]
[app.storage :as-alias sto] [app.storage :as-alias sto]
[app.storage.tmp :as tmp] [app.storage.tmp :as tmp]
@@ -49,11 +50,17 @@
(defn index-handler (defn index-handler
[_cfg _request] [_cfg _request]
(let [{:keys [clock offset]} @clock/current]
{::yres/status 200 {::yres/status 200
::yres/headers {"content-type" "text/html"} ::yres/headers {"content-type" "text/html"}
::yres/body (-> (io/resource "app/templates/debug.tmpl") ::yres/body (-> (io/resource "app/templates/debug.tmpl")
(tmpl/render {:version (:full cf/version) (tmpl/render {:version (:full cf/version)
:supported-features cfeat/supported-features}))}) :current-clock (str clock)
:current-offset (if offset
(ct/format-duration offset)
"NO OFFSET")
:current-time (ct/format-inst (ct/now) :http)
:supported-features cfeat/supported-features}))}))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;; ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; FILE CHANGES ;; FILE CHANGES
@@ -390,34 +397,6 @@
::yres/headers {"content-type" "text/plain"} ::yres/headers {"content-type" "text/plain"}
::yres/body (str/ffmt "PROFILE '%' ACTIVATED" (:email profile))})))))) ::yres/body (str/ffmt "PROFILE '%' ACTIVATED" (:email profile))}))))))
(defn- reset-file-version
[cfg {:keys [params] :as request}]
(let [file-id (some-> params :file-id d/parse-uuid)
version (some-> params :version d/parse-integer)]
(when-not (contains? params :force)
(ex/raise :type :validation
:code :missing-force
:hint "missing force checkbox"))
(when (nil? file-id)
(ex/raise :type :validation
:code :invalid-file-id
:hint "provided invalid file id"))
(when (nil? version)
(ex/raise :type :validation
:code :invalid-version
:hint "provided invalid version"))
(db/tx-run! cfg srepl/process-file! file-id #(assoc % :version version))
{::yres/status 200
::yres/headers {"content-type" "text/plain"}
::yres/body "OK"}))
(defn- handle-team-features (defn- handle-team-features
[cfg {:keys [params] :as request}] [cfg {:keys [params] :as request}]
(let [team-id (some-> params :team-id d/parse-uuid) (let [team-id (some-> params :team-id d/parse-uuid)
@@ -462,6 +441,24 @@
::yres/headers {"content-type" "text/plain"} ::yres/headers {"content-type" "text/plain"}
::yres/body "OK"})))) ::yres/body "OK"}))))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; VIRTUAL CLOCK
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defn- set-virtual-clock
[_ {:keys [params] :as request}]
(let [offset (some-> params :offset str/trim not-empty ct/duration)
reset? (contains? params :reset)]
(if (= "production" (cf/get :tenant))
{::yres/status 501
::yres/body "OPERATION NOT ALLOWED"}
(do
(if (or reset? (zero? (inst-ms offset)))
(clock/set-offset! nil)
(clock/set-offset! offset))
{::yres/status 302
::yres/headers {"location" "/dbg"}}))))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;; ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; OTHER SMALL VIEWS/HANDLERS ;; OTHER SMALL VIEWS/HANDLERS
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;; ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
@@ -548,10 +545,10 @@
["/error/:id" {:handler (partial error-handler cfg)}] ["/error/:id" {:handler (partial error-handler cfg)}]
["/error" {:handler (partial error-list-handler cfg)}] ["/error" {:handler (partial error-list-handler cfg)}]
["/actions" {:middleware [[errors]]} ["/actions" {:middleware [[errors]]}
["/set-virtual-clock"
{:handler (partial set-virtual-clock cfg)}]
["/resend-email-verification" ["/resend-email-verification"
{:handler (partial resend-email-notification cfg)}] {:handler (partial resend-email-notification cfg)}]
["/reset-file-version"
{:handler (partial reset-file-version cfg)}]
["/handle-team-features" ["/handle-team-features"
{:handler (partial handle-team-features cfg)}] {:handler (partial handle-team-features cfg)}]
["/file-export" {:handler (partial export-handler cfg)}] ["/file-export" {:handler (partial export-handler cfg)}]


@@ -11,7 +11,9 @@
[app.common.schema :as sm]
[app.common.schema.generators :as sg]
[app.common.time :as ct]
+[app.config :as cf]
[app.db :as db]
+[app.http.access-token :refer [get-token]]
[app.main :as-alias main]
[app.rpc.commands.profile :as cmd.profile]
[app.setup :as-alias setup]
@@ -30,6 +32,20 @@
[_ params]
(assert (db/pool? (::db/pool params)) "expect valid database pool"))

+(def ^:private auth
+  {:name ::auth
+   :compile
+   (fn [_ _]
+     (fn [handler shared-key]
+       (if shared-key
+         (fn [request]
+           (let [token (get-token request)]
+             (if (= token shared-key)
+               (handler request)
+               {::yres/status 403})))
+         (fn [_ _]
+           {::yres/status 403}))))})

(def ^:private default-system
{:name ::default-system
:compile
@@ -49,7 +65,8 @@
(defmethod ig/init-key ::routes
[_ cfg]
-["" {:middleware [[default-system cfg]
+["" {:middleware [[auth (cf/get :management-api-shared-key)]
+[default-system cfg]
[transaction]]}
["/authenticate"
{:handler authenticate
@@ -79,8 +96,7 @@
(defn- authenticate
[cfg request]
(let [token (-> request :params :token)
-props (get cfg ::setup/props)
-result (tokens/verify props {:token token :iss "authentication"})]
+result (tokens/verify cfg {:token token :iss "authentication"})]
{::yres/status 200
::yres/body result}))
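The `auth` middleware added in this hunk gates the whole management API behind a configured shared key. As an illustration only, here is a hypothetical Python sketch of the same decision (all names invented; the real code is the Clojure `auth` def above, which reads the token via `get-token` and rejects everything when no key is configured):

```python
# Hypothetical sketch of the shared-key gate added by the `auth` middleware.
# `make_auth`, the request dict, and the response dicts are invented shapes.

def make_auth(shared_key):
    """Wrap a handler so it only runs for requests carrying shared_key."""
    def middleware(handler):
        def wrapped(request):
            if shared_key is None:
                # No key configured: the management API is effectively disabled.
                return {"status": 403}
            if request.get("token") == shared_key:
                return handler(request)
            return {"status": 403}
        return wrapped
    return middleware
```

Note the asymmetry mirrored from the Clojure version: a wrong token and a missing configuration both produce 403, so an attacker cannot tell whether the endpoint is disabled or merely rejecting their key.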
@@ -0,0 +1,55 @@
+;; This Source Code Form is subject to the terms of the Mozilla Public
+;; License, v. 2.0. If a copy of the MPL was not distributed with this
+;; file, You can obtain one at http://mozilla.org/MPL/2.0/.
+;;
+;; Copyright (c) KALEIDOS INC
+
+(ns app.http.security
+  "Additional security layer middlewares"
+  (:require
+   [app.config :as cf]
+   [yetti.request :as yreq]
+   [yetti.response :as yres]))
+
+(def ^:private safe-methods
+  #{:get :head :options})
+
+(defn- wrap-sec-fetch-metadata
+  "Sec-Fetch metadata security layer middleware"
+  [handler]
+  (fn [request]
+    (let [site (yreq/get-header request "sec-fetch-site")]
+      (cond
+        (= site "same-origin")
+        (handler request)
+
+        (or (= site "same-site")
+            (= site "cross-site"))
+        (if (contains? safe-methods (yreq/method request))
+          (handler request)
+          {::yres/status 403})
+
+        :else
+        (handler request)))))
+
+(def sec-fetch-metadata
+  {:name ::sec-fetch-metadata
+   :compile (fn [_ _]
+              (when (contains? cf/flags :sec-fetch-metadata-middleware)
+                wrap-sec-fetch-metadata))})
+
+(defn- wrap-client-header-check
+  "Check for a penpot custom header to be present as additional CSRF
+  protection"
+  [handler]
+  (fn [request]
+    (let [client (yreq/get-header request "x-client")]
+      (if (some? client)
+        (handler request)
+        {::yres/status 403}))))
+
+(def client-header-check
+  {:name ::client-header-check
+   :compile (fn [_ _]
+              (when (contains? cf/flags :client-header-check-middleware)
+                wrap-client-header-check))})
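The `wrap-sec-fetch-metadata` middleware in the new file above implements a small decision table over the `Sec-Fetch-Site` request header. As an illustration, this is a Python sketch of that table (invented names; the real middleware returns a Ring-style response map with status 403 instead of a boolean):

```python
# Illustrative sketch of the Sec-Fetch-Site decision table implemented by
# wrap-sec-fetch-metadata above. `allow` is an invented name.

SAFE_METHODS = {"get", "head", "options"}

def allow(sec_fetch_site, method):
    """True when the request may reach the wrapped handler."""
    if sec_fetch_site == "same-origin":
        return True
    if sec_fetch_site in ("same-site", "cross-site"):
        # Cross-origin browser requests may only use safe (read-only) methods.
        return method in SAFE_METHODS
    # Header absent or unknown (old browsers, non-browser clients): pass through.
    return True
```

The permissive fallback in the `:else` branch is deliberate: only browsers send Sec-Fetch metadata, so rejecting requests without the header would break API clients.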
@@ -11,7 +11,6 @@
[app.common.logging :as l]
[app.common.schema :as sm]
[app.common.time :as ct]
-[app.common.uri :as u]
[app.config :as cf]
[app.db :as db]
[app.db.sql :as sql]
@@ -72,7 +71,7 @@
(sm/validator schema:params))
(defn- prepare-session-params
-[key params]
+[params key]
(assert (string? key) "expected key to be a string")
(assert (not (str/blank? key)) "expected key to be not empty")
(assert (valid-params? params) "expected valid params")
@@ -90,7 +89,9 @@
(db/exec-one! pool (sql/select :http-session {:id token})))
(write! [_ key params]
-(let [params (prepare-session-params key params)]
+(let [params (-> params
+             (assoc :created-at (ct/now))
+             (prepare-session-params key))]
(db/insert! pool :http-session params)
params))
@@ -113,7 +114,9 @@
(get @cache token))
(write! [_ key params]
-(let [params (prepare-session-params key params)]
+(let [params (-> params
+             (assoc :created-at (ct/now))
+             (prepare-session-params key))]
(swap! cache assoc key params)
params))
@@ -144,27 +147,23 @@
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(declare ^:private assign-auth-token-cookie)
-(declare ^:private assign-auth-data-cookie)
(declare ^:private clear-auth-token-cookie)
-(declare ^:private clear-auth-data-cookie)
(declare ^:private gen-token)
(defn create-fn
-[{:keys [::manager ::setup/props]} profile-id]
+[{:keys [::manager] :as cfg} profile-id]
(assert (manager? manager) "expected valid session manager")
(assert (uuid? profile-id) "expected valid uuid for profile-id")
(fn [request response]
(let [uagent (yreq/get-header request "user-agent")
params {:profile-id profile-id
-:user-agent uagent
-:created-at (ct/now)}
-token (gen-token props params)
+:user-agent uagent}
+token (gen-token cfg params)
session (write! manager token params)]
-(l/trace :hint "create" :profile-id (str profile-id))
+(l/trc :hint "create" :profile-id (str profile-id))
(-> response
-(assign-auth-token-cookie session)
-(assign-auth-data-cookie session)))))
+(assign-auth-token-cookie session)))))
(defn delete-fn
[{:keys [::manager]}]
@@ -172,23 +171,22 @@
(fn [request response]
(let [cname (cf/get :auth-token-cookie-name default-auth-token-cookie-name)
cookie (yreq/get-cookie request cname)]
-(l/trace :hint "delete" :profile-id (:profile-id request))
+(l/trc :hint "delete" :profile-id (:profile-id request))
(some->> (:value cookie) (delete! manager))
(-> response
(assoc :status 204)
(assoc :body nil)
-(clear-auth-token-cookie)
-(clear-auth-data-cookie)))))
+(clear-auth-token-cookie)))))
(defn- gen-token
-[props {:keys [profile-id created-at]}]
-(tokens/generate props {:iss "authentication"
+[cfg {:keys [profile-id created-at]}]
+(tokens/generate cfg {:iss "authentication"
:iat created-at
:uid profile-id}))
(defn- decode-token
-[props token]
+[cfg token]
(when token
-(tokens/verify props {:token token :iss "authentication"})))
+(tokens/verify cfg {:token token :iss "authentication"})))
(defn- get-token
[request]
@@ -208,18 +206,18 @@
(neg? (compare default-renewal-max-age elapsed)))))
(defn- wrap-soft-auth
-[handler {:keys [::manager ::setup/props]}]
+[handler {:keys [::manager] :as cfg}]
(assert (manager? manager) "expected valid session manager")
(letfn [(handle-request [request]
(try
(let [token (get-token request)
-claims (decode-token props token)]
+claims (decode-token cfg token)]
(cond-> request
(map? claims)
(-> (assoc ::token-claims claims)
(assoc ::token token))))
(catch Throwable cause
-(l/trace :hint "exception on decoding malformed token" :cause cause)
+(l/trc :hint "exception on decoding malformed token" :cause cause)
request)))]
(fn [request]
@@ -239,8 +237,7 @@
(if (renew-session? session)
(let [session (update! manager session)]
(-> response
-(assign-auth-token-cookie session)
-(assign-auth-data-cookie session)))
+(assign-auth-token-cookie session)))
response))))
(def soft-auth
@@ -256,7 +253,7 @@
(defn- assign-auth-token-cookie
[response {token :id updated-at :updated-at}]
(let [max-age (cf/get :auth-token-cookie-max-age default-cookie-max-age)
-created-at (or updated-at (ct/now))
+created-at updated-at
renewal (ct/plus created-at default-renewal-max-age)
expires (ct/plus created-at max-age)
secure? (contains? cf/flags :secure-session-cookies)
@@ -273,53 +270,15 @@
:secure secure?}]
(update response :cookies assoc name cookie)))
-(defn- assign-auth-data-cookie
-[response {profile-id :profile-id updated-at :updated-at}]
-(let [max-age (cf/get :auth-token-cookie-max-age default-cookie-max-age)
-domain (cf/get :auth-data-cookie-domain)
-cname default-auth-data-cookie-name
-created-at (or updated-at (ct/now))
-renewal (ct/plus created-at default-renewal-max-age)
-expires (ct/plus created-at max-age)
-comment (str "Renewal at: " (ct/format-inst renewal :rfc1123))
-secure? (contains? cf/flags :secure-session-cookies)
-strict? (contains? cf/flags :strict-session-cookies)
-cors? (contains? cf/flags :cors)
-cookie {:domain domain
-:expires expires
-:path "/"
-:comment comment
-:value (u/map->query-string {:profile-id profile-id})
-:same-site (if cors? :none (if strict? :strict :lax))
-:secure secure?}]
-(cond-> response
-(string? domain)
-(update :cookies assoc cname cookie))))
(defn- clear-auth-token-cookie
[response]
(let [cname (cf/get :auth-token-cookie-name default-auth-token-cookie-name)]
(update response :cookies assoc cname {:path "/" :value "" :max-age 0})))
-(defn- clear-auth-data-cookie
-[response]
-(let [cname default-auth-data-cookie-name
-domain (cf/get :auth-data-cookie-domain)]
-(cond-> response
-(string? domain)
-(update :cookies assoc cname {:domain domain :path "/" :value "" :max-age 0}))))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; TASK: SESSION GC
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
-;; FIXME: MOVE
(defmethod ig/assert-key ::tasks/gc
[_ params]
(assert (db/pool? (::db/pool params)) "expected valid database pool")
@@ -332,22 +291,23 @@
(def ^:private
sql:delete-expired
-"delete from http_session
-where updated_at < now() - ?::interval
+"DELETE FROM http_session
+WHERE updated_at < ?::timestamptz
or (updated_at is null and
-created_at < now() - ?::interval)")
+created_at < ?::timestamptz)")
(defn- collect-expired-tasks
[{:keys [::db/conn ::tasks/max-age]}]
-(let [interval (db/interval max-age)
-result (db/exec-one! conn [sql:delete-expired interval interval])
-result (:next.jdbc/update-count result)]
+(let [threshold (ct/minus (ct/now) max-age)
+result (-> (db/exec-one! conn [sql:delete-expired threshold threshold])
+(db/get-update-count))]
-(l/debug :task "gc"
+(l/dbg :task "gc"
:hint "clean http sessions"
:deleted result)
result))
(defmethod ig/init-key ::tasks/gc
[_ {:keys [::tasks/max-age] :as cfg}]
-(l/debug :hint "initializing session gc task" :max-age max-age)
-(fn [_] (db/tx-run! cfg collect-expired-tasks)))
+(l/dbg :hint "initializing session gc task" :max-age max-age)
+(fn [_]
+(db/tx-run! cfg collect-expired-tasks)))
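The session GC hunk above moves the expiry computation out of SQL: instead of `now() - ?::interval` evaluated by PostgreSQL, the application now passes an absolute `timestamptz` threshold. As an illustration (invented names, not the Clojure code itself), the rule the rewritten query applies looks like:

```python
from datetime import datetime, timedelta, timezone

# Sketch of the expiry rule: a session is expired when its updated_at --
# or its created_at, when it was never updated -- is older than now - max_age.

def expiry_threshold(max_age: timedelta, now: datetime) -> datetime:
    """Absolute cutoff passed as the ?::timestamptz parameter."""
    return now - max_age

def is_expired(updated_at, created_at, threshold):
    ts = updated_at if updated_at is not None else created_at
    return ts < threshold
```

Computing the threshold once in the application also makes the task easier to test with a controlled clock, which fits the virtual-clock machinery introduced elsewhere in this merge.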
@@ -55,7 +55,7 @@
::yres/body (yres/stream-body
(fn [_ output]
(let [channel (sp/chan :buf buf :xf (keep encode))
-listener (events/start-listener
+listener (events/spawn-listener
channel
(partial write! output)
(partial pu/close! output))]
@@ -9,7 +9,6 @@
[app.common.exceptions :as ex]
[app.common.logging :as l]
[app.common.schema :as sm]
-[app.common.time :as ct]
[app.common.transit :as t]
[app.common.uuid :as uuid]
[app.config :as cf]
@@ -53,9 +52,8 @@
(defn- send!
[{:keys [::uri] :as cfg} events]
-(let [token (tokens/generate (::setup/props cfg)
+(let [token (tokens/generate cfg
{:iss "authentication"
-:iat (ct/now)
:uid uuid/zero})
body (t/encode {:events events})
headers {"content-type" "application/transit+json"
@@ -42,6 +42,7 @@
[app.svgo :as-alias svgo]
[app.util.cron]
[app.worker :as-alias wrk]
+[app.worker.executor]
[clojure.test :as test]
[clojure.tools.namespace.repl :as repl]
[cuerdas.core :as str]
@@ -148,23 +149,11 @@
::mdef/labels []
::mdef/type :histogram}
-:executors-active-threads
-{::mdef/name "penpot_executors_active_threads"
-::mdef/help "Current number of threads available in the executor service."
-::mdef/labels ["name"]
-::mdef/type :gauge}
-:executors-completed-tasks
-{::mdef/name "penpot_executors_completed_tasks_total"
-::mdef/help "Approximate number of completed tasks by the executor."
-::mdef/labels ["name"]
-::mdef/type :counter}
-:executors-running-threads
-{::mdef/name "penpot_executors_running_threads"
-::mdef/help "Current number of threads with state RUNNING."
-::mdef/labels ["name"]
-::mdef/type :gauge}})
+:http-server-dispatch-timing
+{::mdef/name "penpot_http_server_dispatch_timing"
+::mdef/help "Histogram of dispatch handler"
+::mdef/labels []
+::mdef/type :histogram}})
(def system-config
{::db/pool
@@ -176,14 +165,12 @@
::db/max-size (cf/get :database-max-pool-size 60)
::mtx/metrics (ig/ref ::mtx/metrics)}
-;; Default thread pool for IO operations
-::wrk/executor
-{}
-::wrk/monitor
-{::mtx/metrics (ig/ref ::mtx/metrics)
-::wrk/executor (ig/ref ::wrk/executor)
-::wrk/name "default"}
+;; Default netty IO pool (shared between several services)
+::wrk/netty-io-executor
+{:threads (cf/get :netty-io-threads)}
+::wrk/netty-executor
+{:threads (cf/get :executor-threads)}
:app.migrations/migrations
{::db/pool (ig/ref ::db/pool)}
@@ -194,17 +181,27 @@
::mtx/routes
{::mtx/metrics (ig/ref ::mtx/metrics)}
-::rds/redis
-{::rds/uri (cf/get :redis-uri)
-::mtx/metrics (ig/ref ::mtx/metrics)
-::wrk/executor (ig/ref ::wrk/executor)}
+::rds/client
+{::rds/uri
+(cf/get :redis-uri)
+::wrk/netty-executor
+(ig/ref ::wrk/netty-executor)
+::wrk/netty-io-executor
+(ig/ref ::wrk/netty-io-executor)}
+::rds/pool
+{::rds/client (ig/ref ::rds/client)
+::mtx/metrics (ig/ref ::mtx/metrics)}
::mbus/msgbus
-{::wrk/executor (ig/ref ::wrk/executor)
-::rds/redis (ig/ref ::rds/redis)}
+{::wrk/executor (ig/ref ::wrk/netty-executor)
+::rds/client (ig/ref ::rds/client)
+::mtx/metrics (ig/ref ::mtx/metrics)}
:app.storage.tmp/cleaner
-{::wrk/executor (ig/ref ::wrk/executor)}
+{::wrk/executor (ig/ref ::wrk/netty-executor)}
::sto.gc-deleted/handler
{::db/pool (ig/ref ::db/pool)
@@ -232,9 +229,10 @@
::http/host (cf/get :http-server-host)
::http/router (ig/ref ::http/router)
::http/io-threads (cf/get :http-server-io-threads)
+::http/max-worker-threads (cf/get :http-server-max-worker-threads)
::http/max-body-size (cf/get :http-server-max-body-size)
::http/max-multipart-body-size (cf/get :http-server-max-multipart-body-size)
-::wrk/executor (ig/ref ::wrk/executor)}
+::mtx/metrics (ig/ref ::mtx/metrics)}
::ldap/provider
{:host (cf/get :ldap-host)
@@ -312,23 +310,24 @@
::rpc/climit
{::mtx/metrics (ig/ref ::mtx/metrics)
-::wrk/executor (ig/ref ::wrk/executor)
+::wrk/executor (ig/ref ::wrk/netty-executor)
::climit/config (cf/get :rpc-climit-config)
::climit/enabled (contains? cf/flags :rpc-climit)}
:app.rpc/rlimit
-{::wrk/executor (ig/ref ::wrk/executor)}
+{::wrk/executor (ig/ref ::wrk/netty-executor)}
:app.rpc/methods
{::http.client/client (ig/ref ::http.client/client)
::db/pool (ig/ref ::db/pool)
-::wrk/executor (ig/ref ::wrk/executor)
+::rds/pool (ig/ref ::rds/pool)
+::wrk/executor (ig/ref ::wrk/netty-executor)
::session/manager (ig/ref ::session/manager)
::ldap/provider (ig/ref ::ldap/provider)
::sto/storage (ig/ref ::sto/storage)
::mtx/metrics (ig/ref ::mtx/metrics)
::mbus/msgbus (ig/ref ::mbus/msgbus)
-::rds/redis (ig/ref ::rds/redis)
+::rds/client (ig/ref ::rds/client)
::rpc/climit (ig/ref ::rpc/climit)
::rpc/rlimit (ig/ref ::rpc/rlimit)
@@ -341,7 +340,7 @@
:app.rpc.doc/routes
{:app.rpc/methods (ig/ref :app.rpc/methods)}
-:app.rpc/routes
+::rpc/routes
{::rpc/methods (ig/ref :app.rpc/methods)
::db/pool (ig/ref ::db/pool)
::session/manager (ig/ref ::session/manager)
@@ -433,6 +432,9 @@
;; module requires the migrations to run before initialize.
::migrations (ig/ref :app.migrations/migrations)}
+::setup/clock
+{}
:app.loggers.audit.archive-task/handler
{::setup/props (ig/ref ::setup/props)
::db/pool (ig/ref ::db/pool)
@@ -476,13 +478,14 @@
(cf/get :objects-storage-s3-bucket))
::sto.s3/io-threads (or (cf/get :storage-assets-s3-io-threads)
(cf/get :objects-storage-s3-io-threads))
-::wrk/executor (ig/ref ::wrk/executor)}
+::wrk/netty-io-executor
+(ig/ref ::wrk/netty-io-executor)}
:app.storage.fs/backend
{::sto.fs/directory (or (cf/get :storage-assets-fs-directory)
(cf/get :objects-storage-fs-directory))}})
(def worker-config
{::wrk/cron
{::wrk/registry (ig/ref ::wrk/registry)
@@ -518,7 +521,7 @@
:task :audit-log-gc})]}
::wrk/dispatcher
-{::rds/redis (ig/ref ::rds/redis)
+{::rds/client (ig/ref ::rds/client)
::mtx/metrics (ig/ref ::mtx/metrics)
::db/pool (ig/ref ::db/pool)
::wrk/tenant (cf/get :tenant)}
@@ -527,7 +530,7 @@
{::wrk/parallelism (cf/get ::worker-default-parallelism 1)
::wrk/queue :default
::wrk/tenant (cf/get :tenant)
-::rds/redis (ig/ref ::rds/redis)
+::rds/client (ig/ref ::rds/client)
::wrk/registry (ig/ref ::wrk/registry)
::mtx/metrics (ig/ref ::mtx/metrics)
::db/pool (ig/ref ::db/pool)}
@@ -536,7 +539,7 @@
{::wrk/parallelism (cf/get ::worker-webhook-parallelism 1)
::wrk/queue :webhooks
::wrk/tenant (cf/get :tenant)
-::rds/redis (ig/ref ::rds/redis)
+::rds/client (ig/ref ::rds/client)
::wrk/registry (ig/ref ::wrk/registry)
::mtx/metrics (ig/ref ::mtx/metrics)
::db/pool (ig/ref ::db/pool)}})
@@ -447,7 +447,10 @@
:fn (mg/resource "app/migrations/sql/0140-add-locked-by-column-to-file-change-table.sql")}
{:name "0141-add-idx-to-file-library-rel"
-:fn (mg/resource "app/migrations/sql/0141-add-idx-to-file-library-rel.sql")}])
+:fn (mg/resource "app/migrations/sql/0141-add-idx-to-file-library-rel.sql")}
+{:name "0141-add-file-data-table.sql"
+:fn (mg/resource "app/migrations/sql/0141-add-file-data-table.sql")}])
(defn apply-migrations!
[pool name migrations]
[app.common.exceptions :as ex] [app.common.exceptions :as ex]
[app.common.logging :as l] [app.common.logging :as l]
[app.common.pprint] [app.common.pprint]
[app.srepl.fixes.media-refs :refer [process-file]]
[app.srepl.main :as srepl] [app.srepl.main :as srepl]
[app.srepl.procs.media-refs]
[clojure.edn :as edn])) [clojure.edn :as edn]))
(def ^:private required-services (def ^:private required-services
@@ -20,7 +20,10 @@
:app.storage/storage :app.storage/storage
:app.metrics/metrics :app.metrics/metrics
:app.db/pool :app.db/pool
:app.worker/executor]) :app.worker/netty-io-executor])
(def default-options
{:rollback? false})
(defn -main (defn -main
[& [options]] [& [options]]
@@ -28,22 +31,20 @@
(let [config-var (requiring-resolve 'app.main/system-config) (let [config-var (requiring-resolve 'app.main/system-config)
start-var (requiring-resolve 'app.main/start-custom) start-var (requiring-resolve 'app.main/start-custom)
stop-var (requiring-resolve 'app.main/stop) stop-var (requiring-resolve 'app.main/stop)
config (select-keys @config-var required-services)]
config (select-keys @config-var required-services)
options (if (string? options)
(ex/ignoring (edn/read-string options))
{})
options (-> (merge default-options options)
(assoc :proc-fn #'app.srepl.procs.media-refs/fix-media-refs))]
(start-var config) (start-var config)
(let [options (if (string? options)
(ex/ignoring (edn/read-string options))
{})]
(l/inf :hint "executing media-refs migration" :options options) (l/inf :hint "executing media-refs migration" :options options)
(srepl/process-files! process-file options)) (srepl/process! options)
(stop-var) (stop-var)
(System/exit 0)) (System/exit 0))
(catch Throwable cause (catch Throwable cause
(ex/print-throwable cause) (ex/print-throwable cause)
(flush) (flush)
(System/exit -1)))) (System/exit -1))))
@@ -0,0 +1,38 @@
+CREATE TABLE file_data (
+  file_id uuid NOT NULL REFERENCES file(id) DEFERRABLE,
+  id uuid NOT NULL,
+  created_at timestamptz NOT NULL DEFAULT now(),
+  modified_at timestamptz NOT NULL DEFAULT now(),
+  deleted_at timestamptz NULL,
+  type text NOT NULL,
+  backend text NULL,
+  metadata jsonb NULL,
+  data bytea NULL,
+
+  PRIMARY KEY (file_id, id)
+) PARTITION BY HASH (file_id);
+
+CREATE TABLE file_data_00 PARTITION OF file_data FOR VALUES WITH (MODULUS 16, REMAINDER 0);
+CREATE TABLE file_data_01 PARTITION OF file_data FOR VALUES WITH (MODULUS 16, REMAINDER 1);
+CREATE TABLE file_data_02 PARTITION OF file_data FOR VALUES WITH (MODULUS 16, REMAINDER 2);
+CREATE TABLE file_data_03 PARTITION OF file_data FOR VALUES WITH (MODULUS 16, REMAINDER 3);
+CREATE TABLE file_data_04 PARTITION OF file_data FOR VALUES WITH (MODULUS 16, REMAINDER 4);
+CREATE TABLE file_data_05 PARTITION OF file_data FOR VALUES WITH (MODULUS 16, REMAINDER 5);
+CREATE TABLE file_data_06 PARTITION OF file_data FOR VALUES WITH (MODULUS 16, REMAINDER 6);
+CREATE TABLE file_data_07 PARTITION OF file_data FOR VALUES WITH (MODULUS 16, REMAINDER 7);
+CREATE TABLE file_data_08 PARTITION OF file_data FOR VALUES WITH (MODULUS 16, REMAINDER 8);
+CREATE TABLE file_data_09 PARTITION OF file_data FOR VALUES WITH (MODULUS 16, REMAINDER 9);
+CREATE TABLE file_data_10 PARTITION OF file_data FOR VALUES WITH (MODULUS 16, REMAINDER 10);
+CREATE TABLE file_data_11 PARTITION OF file_data FOR VALUES WITH (MODULUS 16, REMAINDER 11);
+CREATE TABLE file_data_12 PARTITION OF file_data FOR VALUES WITH (MODULUS 16, REMAINDER 12);
+CREATE TABLE file_data_13 PARTITION OF file_data FOR VALUES WITH (MODULUS 16, REMAINDER 13);
+CREATE TABLE file_data_14 PARTITION OF file_data FOR VALUES WITH (MODULUS 16, REMAINDER 14);
+CREATE TABLE file_data_15 PARTITION OF file_data FOR VALUES WITH (MODULUS 16, REMAINDER 15);
+
+CREATE INDEX file_data__deleted_at__idx
+  ON file_data (deleted_at, file_id, id)
+  WHERE deleted_at IS NOT NULL;
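The migration above hash-partitions `file_data` on `file_id` into 16 partitions: every row is routed to the partition whose `REMAINDER` equals the hash of its `file_id` modulo 16, so all data rows of one file land in the same partition. A Python sketch of the routing rule, for illustration only (PostgreSQL uses its own internal hash function for `uuid` columns, so the actual partition a given id lands in will differ from what a stand-in hash computes):

```python
import uuid

NUM_PARTITIONS = 16  # must match MODULUS in the DDL above

def partition_for(file_id: uuid.UUID, hash_fn=hash) -> str:
    # hash_fn is a stand-in illustrating the MODULUS/REMAINDER routing rule;
    # it is NOT PostgreSQL's hash, only the shape of the computation.
    remainder = hash_fn(file_id) % NUM_PARTITIONS
    return f"file_data_{remainder:02d}"
```

Because the partition key is part of the primary key `(file_id, id)`, lookups by file prune to a single partition, while the partial index on `deleted_at` keeps the GC scan cheap.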
@@ -16,7 +16,6 @@
[app.redis :as rds]
[app.worker :as wrk]
[integrant.core :as ig]
-[promesa.core :as p]
[promesa.exec :as px]
[promesa.exec.csp :as sp]))
@@ -59,14 +58,16 @@
(assoc ::timeout (ct/duration {:seconds 30})))})
(def ^:private schema:params
-[:map ::rds/redis ::wrk/executor])
+[:map
+::rds/client
+::wrk/executor])
(defmethod ig/assert-key ::msgbus
[_ params]
(assert (sm/check schema:params params)))
(defmethod ig/init-key ::msgbus
-[_ {:keys [::buffer-size ::wrk/executor ::timeout ::rds/redis] :as cfg}]
+[_ {:keys [::buffer-size ::wrk/executor ::timeout] :as cfg}]
(l/info :hint "initialize msgbus" :buffer-size buffer-size)
(let [cmd-ch (sp/chan :buf buffer-size)
rcv-ch (sp/chan :buf (sp/dropping-buffer buffer-size))
@@ -74,8 +75,9 @@
:xf xform-prefix-topic)
state (agent {})
-pconn (rds/connect redis :type :default :timeout timeout)
-sconn (rds/connect redis :type :pubsub :timeout timeout)
+;; Open persistent connections to redis
+pconn (rds/connect cfg :timeout timeout)
+sconn (rds/connect-pubsub cfg :timeout timeout)
_ (set-error-handler! state #(l/error :cause % :hint "unexpected error on agent" ::l/sync? true))
_ (set-error-mode! state :continue)
@@ -189,14 +191,13 @@
(defn- create-listener
[rcv-ch]
-(rds/pubsub-listener
-:on-message (fn [_ topic message]
+{:on-message (fn [_ topic message]
;; There are no back pressure, so we use a slidding
;; buffer for cases when the pubsub broker sends
;; more messages that we can process.
-(let [val {:topic topic :message (t/decode message)}]
+(let [val {:topic topic :message (t/decode-str message)}]
(when-not (sp/offer! rcv-ch val)
-(l/warn :msg "dropping message on subscription loop"))))))
+(l/warn :msg "dropping message on subscription loop"))))})
(defn- process-input
[{:keys [::state ::wrk/executor] :as cfg} topic message]
@@ -216,8 +217,7 @@
(rds/add-listener sconn (create-listener rcv-ch))
(px/thread
-{:name "penpot/msgbus/io-loop"
-:virtual true}
+{:name "penpot/msgbus"}
(try
(loop []
(let [timeout-ch (sp/timeout-chan 1000)
@@ -263,7 +263,7 @@
intended to be used in core.async go blocks."
[{:keys [::pconn] :as cfg} {:keys [topic message]}]
(try
-(p/await! (rds/publish pconn topic (t/encode message)))
+(rds/publish pconn topic (t/encode-str message))
(catch InterruptedException cause
(throw cause))
(catch Throwable cause
@@ -6,23 +6,22 @@
(ns app.redis (ns app.redis
"The msgbus abstraction implemented using redis as underlying backend." "The msgbus abstraction implemented using redis as underlying backend."
(:refer-clojure :exclude [eval]) (:refer-clojure :exclude [eval get set run!])
(:require (:require
[app.common.data :as d] [app.common.data :as d]
[app.common.exceptions :as ex] [app.common.exceptions :as ex]
[app.common.generic-pool :as gpool]
[app.common.logging :as l] [app.common.logging :as l]
[app.common.schema :as sm] [app.common.schema :as sm]
[app.common.time :as ct] [app.common.time :as ct]
[app.metrics :as mtx] [app.metrics :as mtx]
[app.redis.script :as-alias rscript] [app.redis.script :as-alias rscript]
[app.util.cache :as cache] [app.worker :as wrk]
[app.worker :as-alias wrk] [app.worker.executor]
[clojure.core :as c] [clojure.core :as c]
[clojure.java.io :as io] [clojure.java.io :as io]
[cuerdas.core :as str] [cuerdas.core :as str]
[integrant.core :as ig] [integrant.core :as ig])
[promesa.core :as p]
[promesa.exec :as px])
(:import (:import
clojure.lang.MapEntry clojure.lang.MapEntry
io.lettuce.core.KeyValue io.lettuce.core.KeyValue
@@ -32,12 +31,10 @@
io.lettuce.core.RedisException io.lettuce.core.RedisException
io.lettuce.core.RedisURI io.lettuce.core.RedisURI
io.lettuce.core.ScriptOutputType io.lettuce.core.ScriptOutputType
io.lettuce.core.api.StatefulConnection io.lettuce.core.SetArgs
io.lettuce.core.api.StatefulRedisConnection io.lettuce.core.api.StatefulRedisConnection
io.lettuce.core.api.async.RedisAsyncCommands
io.lettuce.core.api.async.RedisScriptingAsyncCommands
io.lettuce.core.api.sync.RedisCommands io.lettuce.core.api.sync.RedisCommands
io.lettuce.core.codec.ByteArrayCodec io.lettuce.core.api.sync.RedisScriptingCommands
io.lettuce.core.codec.RedisCodec io.lettuce.core.codec.RedisCodec
io.lettuce.core.codec.StringCodec io.lettuce.core.codec.StringCodec
io.lettuce.core.pubsub.RedisPubSubListener io.lettuce.core.pubsub.RedisPubSubListener
@@ -45,220 +42,158 @@
    io.lettuce.core.pubsub.api.sync.RedisPubSubCommands
    io.lettuce.core.resource.ClientResources
    io.lettuce.core.resource.DefaultClientResources
+   io.netty.channel.nio.NioEventLoopGroup
    io.netty.util.HashedWheelTimer
    io.netty.util.Timer
+   io.netty.util.concurrent.EventExecutorGroup
    java.lang.AutoCloseable
    java.time.Duration))

 (set! *warn-on-reflection* true)
-(declare ^:private initialize-resources)
-(declare ^:private shutdown-resources)
-(declare ^:private impl-eval)
-
-(defprotocol IRedis
-  (-connect [_ options])
-  (-get-or-connect [_ key options]))
-
-(defprotocol IConnection
-  (publish [_ topic message])
-  (rpush [_ key payload])
-  (blpop [_ timeout keys])
-  (eval [_ script]))
-
-(defprotocol IPubSubConnection
-  (add-listener [_ listener])
-  (subscribe [_ topics])
-  (unsubscribe [_ topics]))
-
-(def default-codec
-  (RedisCodec/of StringCodec/UTF8 ByteArrayCodec/INSTANCE))
-
-(def string-codec
-  (RedisCodec/of StringCodec/UTF8 StringCodec/UTF8))
-
-(sm/register!
- {:type ::connection
-  :pred #(satisfies? IConnection %)
-  :type-properties
-  {:title "connection"
-   :description "redis connection instance"}})
-
-(sm/register!
- {:type ::pubsub-connection
-  :pred #(satisfies? IPubSubConnection %)
-  :type-properties
-  {:title "connection"
-   :description "redis connection instance"}})
-
-(defn redis?
-  [o]
-  (satisfies? IRedis o))
-
-(sm/register!
- {:type ::redis
-  :pred redis?})
-
-(def ^:private schema:script
-  [:map {:title "script"}
-   [::rscript/name qualified-keyword?]
-   [::rscript/path ::sm/text]
-   [::rscript/keys {:optional true} [:vector :any]]
-   [::rscript/vals {:optional true} [:vector :any]]])
-
-(def valid-script?
-  (sm/lazy-validator schema:script))
-
-(defmethod ig/expand-key ::redis
-  [k v]
-  (let [cpus    (px/get-available-processors)
-        threads (max 1 (int (* cpus 0.2)))]
-    {k (-> (d/without-nils v)
-           (assoc ::timeout (ct/duration "10s"))
-           (assoc ::io-threads (max 3 threads))
-           (assoc ::worker-threads (max 3 threads)))}))
-
-(def ^:private schema:redis-params
-  [:map {:title "redis-params"}
-   ::wrk/executor
-   ::mtx/metrics
-   [::uri ::sm/uri]
-   [::worker-threads ::sm/int]
-   [::io-threads ::sm/int]
-   [::timeout ::ct/duration]])
-
-(defmethod ig/assert-key ::redis
-  [_ params]
-  (assert (sm/check schema:redis-params params)))
-
-(defmethod ig/init-key ::redis
-  [_ params]
-  (initialize-resources params))
-
-(defmethod ig/halt-key! ::redis
-  [_ instance]
-  (d/close! instance))
-
-(defn- initialize-resources
-  "Initialize redis connection resources"
-  [{:keys [::uri ::io-threads ::worker-threads ::wrk/executor ::mtx/metrics] :as params}]
-  (l/inf :hint "initialize redis resources"
-         :uri (str uri)
-         :io-threads io-threads
-         :worker-threads worker-threads)
-  (let [timer     (HashedWheelTimer.)
-        resources (.. (DefaultClientResources/builder)
-                      (ioThreadPoolSize ^long io-threads)
-                      (computationThreadPoolSize ^long worker-threads)
-                      (timer ^Timer timer)
-                      (build))
-        redis-uri (RedisURI/create ^String (str uri))
-
-        shutdown  (fn [client conn]
-                    (ex/ignoring (.close ^StatefulConnection conn))
-                    (ex/ignoring (.close ^RedisClient client))
-                    (l/trc :hint "disconnect" :hid (hash client)))
-
-        on-remove (fn [key val cause]
-                    (l/trace :hint "evict connection (cache)" :key key :reason cause)
-                    (some-> val d/close!))
-
-        cache     (cache/create :executor executor
-                                :on-remove on-remove
-                                :keepalive "5m")]
-    (reify
-      java.lang.AutoCloseable
-      (close [_]
-        (ex/ignoring (cache/invalidate! cache))
-        (ex/ignoring (.shutdown ^ClientResources resources))
-        (ex/ignoring (.stop ^Timer timer)))
-
-      IRedis
-      (-get-or-connect [this key options]
-        (let [create (fn [_] (-connect this options))]
-          (cache/get cache key create)))
-
-      (-connect [_ options]
-        (let [timeout (or (:timeout options) (::timeout params))
-              codec   (get options :codec default-codec)
-              type    (get options :type :default)
-              client  (RedisClient/create ^ClientResources resources
-                                          ^RedisURI redis-uri)]
-          (l/trc :hint "connect" :hid (hash client))
-          (if (= type :pubsub)
-            (let [conn (.connectPubSub ^RedisClient client
-                                       ^RedisCodec codec)]
-              (.setTimeout ^StatefulConnection conn
-                           ^Duration timeout)
-              (reify
-                IPubSubConnection
-                (add-listener [_ listener]
-                  (assert (instance? RedisPubSubListener listener) "expected listener instance")
-                  (.addListener ^StatefulRedisPubSubConnection conn
-                                ^RedisPubSubListener listener))
-
-                (subscribe [_ topics]
-                  (try
-                    (let [topics (into-array String (map str topics))
-                          cmd    (.sync ^StatefulRedisPubSubConnection conn)]
-                      (.subscribe ^RedisPubSubCommands cmd topics))
-                    (catch RedisCommandInterruptedException cause
-                      (throw (InterruptedException. (ex-message cause))))))
-
-                (unsubscribe [_ topics]
-                  (try
-                    (let [topics (into-array String (map str topics))
-                          cmd    (.sync ^StatefulRedisPubSubConnection conn)]
-                      (.unsubscribe ^RedisPubSubCommands cmd topics))
-                    (catch RedisCommandInterruptedException cause
-                      (throw (InterruptedException. (ex-message cause))))))
-
-                AutoCloseable
-                (close [_] (shutdown client conn))))
-            (let [conn (.connect ^RedisClient client ^RedisCodec codec)]
-              (.setTimeout ^StatefulConnection conn ^Duration timeout)
-              (reify
-                IConnection
-                (publish [_ topic message]
-                  (assert (string? topic) "expected topic to be string")
-                  (assert (bytes? message) "expected message to be a byte array")
-                  (let [pcomm (.async ^StatefulRedisConnection conn)]
-                    (.publish ^RedisAsyncCommands pcomm ^String topic ^bytes message)))
-
-                (rpush [_ key payload]
-                  (assert (or (and (vector? payload)
-                                   (every? bytes? payload))
-                              (bytes? payload)))
-                  (try
-                    (let [cmd  (.sync ^StatefulRedisConnection conn)
-                          data (if (vector? payload) payload [payload])
-                          vals (make-array (. Class (forName "[B")) (count data))]
-                      (loop [i 0 xs (seq data)]
-                        (when xs
-                          (aset ^"[[B" vals i ^bytes (first xs))
-                          (recur (inc i) (next xs))))
-                      (.rpush ^RedisCommands cmd
-                              ^String key
-                              ^"[[B" vals))
-                    (catch RedisCommandInterruptedException cause
-                      (throw (InterruptedException. (ex-message cause))))))
-
-                (blpop [_ timeout keys]
-                  (try
-                    (let [keys    (into-array Object (map str keys))
-                          cmd     (.sync ^StatefulRedisConnection conn)
-                          timeout (/ (double (inst-ms timeout)) 1000.0)]
-                      (when-let [res (.blpop ^RedisCommands cmd
-                                             ^double timeout
-                                             ^"[Ljava.lang.String;" keys)]
-                        (MapEntry/create
+(def ^:const MAX-EVAL-RETRIES 18)
+
+(def default-timeout
+  (ct/duration "10s"))
+
+;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
+;; IMPL & PRIVATE API
+;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
+
+(defprotocol IConnection
+  (-set-timeout [_ timeout] "set connection timeout")
+  (-get-timeout [_] "get current timeout")
+  (-reset-timeout [_] "reset to default timeout"))
+
+(defprotocol IDefaultConnection
+  "Public API of default redis connection"
+  (-publish [_ topic message])
+  (-rpush [_ key payload])
+  (-blpop [_ timeout keys])
+  (-eval [_ script])
+  (-get [_ key])
+  (-set [_ key val args])
+  (-del [_ key-or-keys])
+  (-ping [_]))
+
+(defprotocol IPubSubConnection
+  (-add-listener [_ listener])
+  (-subscribe [_ topics])
+  (-unsubscribe [_ topics]))
+
+(def ^:private default-codec
+  (RedisCodec/of StringCodec/UTF8 StringCodec/UTF8))
+
+(defn- impl-eval
+  [cmd cache metrics script]
+  (let [keys  (into-array String (map str (::rscript/keys script)))
+        vals  (into-array String (map str (::rscript/vals script)))
+        sname (::rscript/name script)
+
+        read-script
+        (fn []
+          (-> script ::rscript/path io/resource slurp))
+
+        load-script
+        (fn []
+          (let [id (.scriptLoad ^RedisScriptingCommands cmd
+                                ^String (read-script))]
+            (swap! cache assoc sname id)
+            (l/trc :hint "load script" :name sname :id id)
+            id))
+
+        eval-script
+        (fn [id]
+          (try
+            (let [tpoint  (ct/tpoint)
+                  result  (.evalsha ^RedisScriptingCommands cmd
+                                    ^String id
+                                    ^ScriptOutputType ScriptOutputType/MULTI
+                                    ^"[Ljava.lang.String;" keys
+                                    ^"[Ljava.lang.String;" vals)
+                  elapsed (tpoint)]
+              (mtx/run! metrics {:id :redis-eval-timing
+                                 :labels [(name sname)]
+                                 :val (inst-ms elapsed)})
+              (l/trc :hint "eval script"
+                     :name (name sname)
+                     :id id
+                     :params (str/join "," (::rscript/vals script))
+                     :elapsed (ct/format-duration elapsed))
+              result)
+            (catch io.lettuce.core.RedisNoScriptException _cause
+              ::load)
+            (catch Throwable cause
+              (when-let [on-error (::rscript/on-error script)]
+                (on-error cause))
+              (throw cause))))
+
+        eval-script'
+        (fn [id]
+          (loop [id      id
+                 retries 0]
+            (if (> retries MAX-EVAL-RETRIES)
+              (ex/raise :type :internal
+                        :code ::max-eval-retries-reached
+                        :hint (str "unable to eval redis script " sname))
+              (let [result (eval-script id)]
+                (if (= result ::load)
+                  (recur (load-script)
+                         (inc retries))
+                  result)))))]
+
+    (if-let [id (c/get @cache sname)]
+      (eval-script' id)
+      (-> (load-script)
+          (eval-script')))))
+
+(deftype Connection [^StatefulRedisConnection conn
+                     ^RedisCommands cmd
+                     ^Duration timeout
+                     cache metrics]
+  AutoCloseable
+  (close [_]
+    (ex/ignoring (.close conn)))
+
+  IConnection
+  (-set-timeout [_ timeout]
+    (.setTimeout conn ^Duration timeout))
+
+  (-reset-timeout [_]
+    (.setTimeout conn timeout))
+
+  (-get-timeout [_]
+    (.getTimeout conn))
+
+  IDefaultConnection
+  (-publish [_ topic message]
+    (.publish cmd ^String topic ^String message))
+
+  (-rpush [_ key elements]
+    (try
+      (let [vals (make-array String (count elements))]
+        (loop [i 0 xs (seq elements)]
+          (when xs
+            (aset ^"[[Ljava.lang.String;" vals i ^String (first xs))
+            (recur (inc i) (next xs))))
+        (.rpush cmd
+                ^String key
+                ^"[[Ljava.lang.String;" vals))
+      (catch RedisCommandInterruptedException cause
+        (throw (InterruptedException. (ex-message cause))))))
+
+  (-blpop [_ keys timeout]
+    (try
+      (let [keys (into-array String keys)]
+        (when-let [res (.blpop cmd
+                               ^double timeout
+                               ^"[Ljava.lang.String;" keys)]
+          (MapEntry/create
@@ -267,22 +202,78 @@
       (catch RedisCommandInterruptedException cause
         (throw (InterruptedException. (ex-message cause))))))
-
-                (eval [_ script]
-                  (assert (valid-script? script) "expected valid script")
-                  (impl-eval conn metrics script))
-
-                AutoCloseable
-                (close [_] (shutdown client conn))))))))))
-
-(defn connect
-  [instance & {:as opts}]
-  (assert (satisfies? IRedis instance) "expected valid redis instance")
-  (-connect instance opts))
-
-(defn get-or-connect
-  [instance key & {:as opts}]
-  (assert (satisfies? IRedis instance) "expected valid redis instance")
-  (-get-or-connect instance key opts))
+
+  (-get [_ key]
+    (assert (string? key) "key expected to be string")
+    (.get cmd ^String key))
+
+  (-set [_ key val args]
+    (.set cmd
+          ^String key
+          ^bytes val
+          ^SetArgs args))
+
+  (-del [_ keys]
+    (let [keys (into-array String keys)]
+      (.del cmd ^String/1 keys)))
+
+  (-ping [_]
+    (.ping cmd))
+
+  (-eval [_ script]
+    (impl-eval cmd cache metrics script)))
+
+(deftype SubscriptionConnection [^StatefulRedisPubSubConnection conn
+                                 ^RedisPubSubCommands cmd
+                                 ^Duration timeout]
+  AutoCloseable
+  (close [_]
+    (ex/ignoring (.close conn)))
+
+  IConnection
+  (-set-timeout [_ timeout]
+    (.setTimeout conn ^Duration timeout))
+
+  (-reset-timeout [_]
+    (.setTimeout conn timeout))
+
+  (-get-timeout [_]
+    (.getTimeout conn))
+
+  IPubSubConnection
+  (-add-listener [_ listener]
+    (.addListener conn ^RedisPubSubListener listener))
+
+  (-subscribe [_ topics]
+    (try
+      (let [topics (into-array String topics)]
+        (.subscribe cmd topics))
+      (catch RedisCommandInterruptedException cause
+        (throw (InterruptedException. (ex-message cause))))))
+
+  (-unsubscribe [_ topics]
+    (try
+      (let [topics (into-array String topics)]
+        (.unsubscribe cmd topics))
+      (catch RedisCommandInterruptedException cause
+        (throw (InterruptedException. (ex-message cause)))))))
+
+;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
+;; PUBLIC API
+;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
+
+(defn build-set-args
+  [options]
+  (reduce-kv (fn [^SetArgs args k v]
+               (case k
+                 :ex (if (instance? Duration v)
+                       (.ex args ^Duration v)
+                       (.ex args (long v)))
+                 :px (.px args (long v))
+                 :nx (if v (.nx args) args)
+                 :keep-ttl (if v (.keepttl args) args)))
+             (SetArgs.)
+             options))
(defn pubsub-listener
  [& {:keys [on-message on-subscribe on-unsubscribe]}]
@@ -311,61 +302,172 @@
     (when on-unsubscribe
       (on-unsubscribe nil topic count)))))

-(def ^:private scripts-cache (atom {}))
-
-(defn- impl-eval
-  [^StatefulRedisConnection connection metrics script]
-  (let [cmd   (.async ^StatefulRedisConnection connection)
-        keys  (into-array String (map str (::rscript/keys script)))
-        vals  (into-array String (map str (::rscript/vals script)))
-        sname (::rscript/name script)]
-    (letfn [(on-error [cause]
-              (if (instance? io.lettuce.core.RedisNoScriptException cause)
-                (do
-                  (l/error :hint "no script found" :name sname :cause cause)
-                  (->> (load-script)
-                       (p/mcat eval-script)))
-                (if-let [on-error (::rscript/on-error script)]
-                  (on-error cause)
-                  (p/rejected cause))))
-
-            (eval-script [sha]
-              (let [tpoint (ct/tpoint)]
-                (->> (.evalsha ^RedisScriptingAsyncCommands cmd
-                               ^String sha
-                               ^ScriptOutputType ScriptOutputType/MULTI
-                               ^"[Ljava.lang.String;" keys
-                               ^"[Ljava.lang.String;" vals)
-                     (p/fmap (fn [result]
-                               (let [elapsed (tpoint)]
-                                 (mtx/run! metrics {:id :redis-eval-timing
-                                                    :labels [(name sname)]
-                                                    :val (inst-ms elapsed)})
-                                 (l/trace :hint "eval script"
-                                          :name (name sname)
-                                          :sha sha
-                                          :params (str/join "," (::rscript/vals script))
-                                          :elapsed (ct/format-duration elapsed))
-                                 result)))
-                     (p/merr on-error))))
-
-            (read-script []
-              (-> script ::rscript/path io/resource slurp))
-
-            (load-script []
-              (l/trace :hint "load script" :name sname)
-              (->> (.scriptLoad ^RedisScriptingAsyncCommands cmd
-                                ^String (read-script))
-                   (p/fmap (fn [sha]
-                             (swap! scripts-cache assoc sname sha)
-                             sha))))]
-
-      (p/await!
-       (if-let [sha (get @scripts-cache sname)]
-         (eval-script sha)
-         (->> (load-script)
-              (p/mapcat eval-script)))))))
+(defn connect
+  [cfg & {:as options}]
+  (assert (contains? cfg ::mtx/metrics) "missing ::mtx/metrics on provided system")
+  (assert (contains? cfg ::client) "missing ::rds/client on provided system")
+
+  (let [state   (::client cfg)
+        cache   (::cache state)
+        client  (::client state)
+        timeout (or (some-> (:timeout options) ct/duration)
+                    (::timeout state))
+        conn    (.connect ^RedisClient client
+                          ^RedisCodec default-codec)
+        cmd     (.sync ^StatefulRedisConnection conn)]
+    (.setTimeout ^StatefulRedisConnection conn ^Duration timeout)
+    (->Connection conn cmd timeout cache (::mtx/metrics cfg))))
+
+(defn connect-pubsub
+  [cfg & {:as options}]
+  (let [state   (::client cfg)
+        client  (::client state)
+        timeout (or (some-> (:timeout options) ct/duration)
+                    (::timeout state))
+        conn    (.connectPubSub ^RedisClient client
+                                ^RedisCodec default-codec)
+        cmd     (.sync ^StatefulRedisPubSubConnection conn)]
+    (.setTimeout ^StatefulRedisPubSubConnection conn
+                 ^Duration timeout)
+    (->SubscriptionConnection conn cmd timeout)))
(defn get
[conn key]
(assert (string? key) "key must be string instance")
(try
(-get conn key)
(catch RedisCommandTimeoutException cause
(l/err :hint "timeout on get redis key" :key key :cause cause)
nil)))
(defn set
([conn key val]
(set conn key val nil))
([conn key val args]
(assert (string? key) "key must be string instance")
(assert (string? val) "val must be string instance")
(let [args (cond
(or (instance? SetArgs args)
(nil? args))
args
(map? args)
(build-set-args args)
:else
(throw (IllegalArgumentException. "invalid args")))]
(try
(-set conn key val args)
(catch RedisCommandTimeoutException cause
(l/err :hint "timeout on set redis key" :key key :cause cause)
nil)))))
(defn del
[conn key-or-keys]
(let [keys (if (vector? key-or-keys) key-or-keys [key-or-keys])]
(assert (every? string? keys) "only string keys allowed")
(try
(-del conn keys)
(catch RedisCommandTimeoutException cause
      (l/err :hint "timeout on del redis key" :keys keys :cause cause)
nil))))
(defn ping
[conn]
(-ping conn))
(defn blpop
[conn key-or-keys timeout]
(let [keys (if (vector? key-or-keys) key-or-keys [key-or-keys])
timeout (cond
(ct/duration? timeout)
(/ (double (inst-ms timeout)) 1000.0)
(double? timeout)
timeout
(int? timeout)
(/ (double timeout) 1000.0)
:else
0)]
(assert (every? string? keys) "only string keys allowed")
(-blpop conn keys timeout)))
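The timeout coercion in `blpop` above normalizes a duration, a double (seconds), or an integer (milliseconds) to the fractional seconds that lettuce's `blpop` expects. A self-contained sketch, using `java.time.Duration` in place of the project's `app.common.time` durations (an assumption made so the example runs standalone):

```clojure
(import 'java.time.Duration)

(defn timeout->seconds
  "Sketch of blpop's timeout normalization: Duration and integer
  milliseconds become double seconds; doubles pass through; anything
  else means block forever (0 in Redis semantics)."
  [timeout]
  (cond
    (instance? Duration timeout)
    (/ (double (.toMillis ^Duration timeout)) 1000.0)

    (double? timeout)
    timeout

    (int? timeout)
    (/ (double timeout) 1000.0)

    :else
    0))

(assert (= 1.5 (timeout->seconds (Duration/ofMillis 1500))))
(assert (= 2.0 (timeout->seconds 2000)))
(assert (= 0.5 (timeout->seconds 0.5)))
```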
(defn rpush
[conn key elements]
(assert (string? key) "key must be string instance")
(assert (every? string? elements) "elements should be all strings")
(let [elements (vec elements)]
(-rpush conn key elements)))
(defn publish
[conn topic payload]
(assert (string? topic) "expected topic to be string")
  (assert (string? payload) "expected payload to be string")
(-publish conn topic payload))
(def ^:private schema:script
[:map {:title "script"}
[::rscript/name qualified-keyword?]
[::rscript/path ::sm/text]
[::rscript/keys {:optional true} [:vector :any]]
[::rscript/vals {:optional true} [:vector :any]]])
(def ^:private valid-script?
(sm/lazy-validator schema:script))
(defn eval
[conn script]
(assert (valid-script? script) "expected valid script")
(-eval conn script))
(defn add-listener
[conn listener]
(let [listener (cond
(map? listener)
(pubsub-listener listener)
(instance? RedisPubSubListener listener)
listener
:else
(throw (IllegalArgumentException. "invalid listener provided")))]
(-add-listener conn listener)))
(defn subscribe
[conn topic-or-topics]
(let [topics (if (vector? topic-or-topics) topic-or-topics [topic-or-topics])]
(assert (every? string? topics))
(-subscribe conn topics)))
(defn unsubscribe
[conn topic-or-topics]
(let [topics (if (vector? topic-or-topics) topic-or-topics [topic-or-topics])]
(assert (every? string? topics))
(-unsubscribe conn topics)))
(defn set-timeout
[conn timeout]
(let [timeout (ct/duration timeout)]
(-set-timeout conn timeout)))
(defn get-timeout
[conn]
(-get-timeout conn))
(defn reset-timeout
[conn]
(-reset-timeout conn))
(defn timeout-exception? (defn timeout-exception?
[cause] [cause]
@@ -374,3 +476,121 @@
(defn exception? (defn exception?
[cause] [cause]
(instance? RedisException cause)) (instance? RedisException cause))
(defn get-pooled
[cfg]
(let [pool (::pool cfg)]
(gpool/get pool)))
(defn close
[o]
(.close ^AutoCloseable o))
(defn pool
[cfg & {:as options}]
(gpool/create :create-fn (partial connect cfg options)
:destroy-fn close
:dispose-fn -reset-timeout))
(defn run!
  [cfg f & args]
  (if (gpool/pool? cfg)
    ;; allow passing the pool directly instead of a system map
    (apply run! {::pool cfg} f args)
    (let [pool (::pool cfg)]
      (with-open [^AutoCloseable conn (gpool/get pool)]
        (apply f (assoc cfg ::conn @conn) args)))))
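The checkout/execute/return pattern behind `run!` can be sketched without `app.common.generic-pool`: `with-pooled` is a hypothetical stand-in where the pool is just an atom holding a vector of connections, shown only to illustrate the borrow-then-give-back discipline.

```clojure
(defn with-pooled
  "Sketch of the run! pattern: take a connection out of the pool,
  call f with it, and always return it to the pool afterwards."
  [pool f & args]
  (let [conn (-> (swap-vals! pool pop) first peek)]
    (try
      (apply f conn args)
      (finally
        ;; give the connection back even if f throws
        (swap! pool conj conn)))))

(def pool (atom [:conn-a :conn-b]))

;; borrows :conn-b (the last one), then returns it
(assert (= [:conn-b :payload] (with-pooled pool (fn [c x] [c x]) :payload)))
(assert (= 2 (count @pool)))
```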
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; INITIALIZATION
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defmethod ig/expand-key ::client
[k v]
{k (-> (d/without-nils v)
(assoc ::timeout (ct/duration "10s")))})
(def ^:private schema:client
[:map {:title "RedisClient"}
[::timer [:fn #(instance? HashedWheelTimer %)]]
[::cache ::sm/atom]
[::timeout ::ct/duration]
[::resources [:fn #(instance? DefaultClientResources %)]]])
(def check-client
(sm/check-fn schema:client))
(sm/register! ::client schema:client)
(sm/register!
{:type ::pool
:pred gpool/pool?})
(def ^:private schema:client-params
[:map {:title "redis-params"}
::wrk/netty-io-executor
::wrk/netty-executor
[::uri ::sm/uri]
[::timeout ::ct/duration]])
(def ^:private check-client-params
(sm/check-fn schema:client-params))
(defmethod ig/assert-key ::client
[_ params]
(check-client-params params))
(defmethod ig/init-key ::client
[_ {:keys [::uri ::wrk/netty-io-executor ::wrk/netty-executor] :as params}]
(l/inf :hint "initialize redis client" :uri (str uri))
(let [timer (HashedWheelTimer.)
cache (atom {})
resources (.. (DefaultClientResources/builder)
(eventExecutorGroup ^EventExecutorGroup netty-executor)
;; We provide lettuce with a shared event loop
;; group instance instead of letting lettuce to
;; create its own
(eventLoopGroupProvider
(reify io.lettuce.core.resource.EventLoopGroupProvider
(allocate [_ _] netty-io-executor)
(threadPoolSize [_]
(.executorCount ^NioEventLoopGroup netty-io-executor))
(release [_ _ _ _ _]
;; Do nothing
)
(shutdown [_ _ _ _]
;; Do nothing
)))
(timer ^Timer timer)
(build))
redis-uri (RedisURI/create ^String (str uri))
client (RedisClient/create ^ClientResources resources
^RedisURI redis-uri)]
{::client client
::cache cache
::timer timer
::timeout default-timeout
::resources resources}))
(defmethod ig/halt-key! ::client
[_ {:keys [::client ::timer ::resources]}]
(ex/ignoring (.shutdown ^RedisClient client))
(ex/ignoring (.shutdown ^ClientResources resources))
(ex/ignoring (.stop ^Timer timer)))
(defmethod ig/assert-key ::pool
[_ {:keys [::client]}]
(check-client client))
(defmethod ig/init-key ::pool
[_ cfg]
(pool cfg {:timeout (ct/duration 2000)}))
(defmethod ig/halt-key! ::pool
[_ instance]
(.close ^java.lang.AutoCloseable instance))
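The core contract of the new `impl-eval` (retry `EVALSHA`, re-`SCRIPT LOAD` on a NOSCRIPT reply, bounded by `MAX-EVAL-RETRIES`) can be exercised without Redis. In this sketch `fake-evalsha` and `fake-load` are hypothetical stand-ins for the lettuce calls, and `::load` plays the role of the NOSCRIPT sentinel used above:

```clojure
(def ^:const MAX-EVAL-RETRIES 18)

(defn eval-with-reload
  "Minimal sketch of impl-eval's retry loop: eval-fn returns ::load when
  the server does not know the script; load-fn reloads it and returns a
  fresh sha id; give up after MAX-EVAL-RETRIES attempts."
  [eval-fn load-fn id]
  (loop [id id, retries 0]
    (if (> retries MAX-EVAL-RETRIES)
      (throw (ex-info "unable to eval redis script" {:retries retries}))
      (let [result (eval-fn id)]
        (if (= result ::load)
          (recur (load-fn) (inc retries))
          result)))))

;; hypothetical stand-ins: the cached sha is stale, one reload succeeds
(defn fake-evalsha [id] (if (= id "fresh") :ok ::load))
(defn fake-load [] "fresh")

(assert (= :ok (eval-with-reload fake-evalsha fake-load "stale")))
```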


@@ -23,6 +23,7 @@
    [app.main :as-alias main]
    [app.metrics :as mtx]
    [app.msgbus :as-alias mbus]
+   [app.redis :as rds]
    [app.rpc.climit :as climit]
    [app.rpc.cond :as cond]
    [app.rpc.helpers :as rph]
@@ -239,7 +240,6 @@
     'app.rpc.commands.files
     'app.rpc.commands.files-create
     'app.rpc.commands.files-share
-    'app.rpc.commands.files-temp
     'app.rpc.commands.files-update
     'app.rpc.commands.files-snapshot
     'app.rpc.commands.files-thumbnails
@@ -262,6 +262,7 @@
    ::session/manager
    ::http.client/client
    ::db/pool
+   ::rds/pool
    ::mbus/msgbus
    ::sto/storage
    ::mtx/metrics


@@ -21,7 +21,6 @@
    [clojure.set :as set]
    [datoteka.fs :as fs]
    [integrant.core :as ig]
-   [promesa.exec :as px]
    [promesa.exec.bulkhead :as pbh])
   (:import
    clojure.lang.ExceptionInfo
@@ -289,13 +288,9 @@
     (get-limits cfg)))

 (defn invoke!
-  "Run a function in context of climit.
-  Intended to be used in virtual threads."
-  [{:keys [::executor ::rpc/climit] :as cfg} f params]
+  "Run a function in context of climit."
+  [{:keys [::rpc/climit] :as cfg} f params]
   (let [f (if climit
-            (let [f (if (some? executor)
-                      (fn [cfg params] (px/await! (px/submit! executor (fn [] (f cfg params)))))
-                      f)]
-              (build-exec-chain cfg f))
+            (build-exec-chain cfg f)
             f)]
     (f cfg params)))


@@ -23,14 +23,14 @@
   (dissoc row :perms))

 (defn create-access-token
-  [{:keys [::db/conn ::setup/props]} profile-id name expiration]
-  (let [created-at (ct/now)
-        token-id   (uuid/next)
-        token      (tokens/generate props {:iss "access-token"
-                                           :tid token-id
-                                           :iat created-at})
-        expires-at (some-> expiration ct/in-future)
+  [{:keys [::db/conn] :as cfg} profile-id name expiration]
+  (let [token-id   (uuid/next)
+        expires-at (some-> expiration (ct/in-future))
+        created-at (ct/now)
+        token      (tokens/generate cfg {:iss "access-token"
+                                         :iat created-at
+                                         :tid token-id})
         token      (db/insert! conn :access-token
                                {:id token-id
                                 :name name


@@ -6,6 +6,7 @@
 (ns app.rpc.commands.auth
   (:require
+   [app.auth :as auth]
    [app.common.data :as d]
    [app.common.data.macros :as dm]
    [app.common.exceptions :as ex]
@@ -62,7 +63,7 @@
     (ex/raise :type :validation
               :code :account-without-password
               :hint "the current account does not have password")
-    (let [result (profile/verify-password cfg password (:password profile))]
+    (let [result (auth/verify-password password (:password profile))]
       (when (:update result)
         (l/trc :hint "updating profile password"
                :id (str (:id profile))
@@ -98,7 +99,7 @@
                      (profile/strip-private-attrs))

         invitation (when-let [token (:invitation-token params)]
-                      (tokens/verify (::setup/props cfg) {:token token :iss :team-invitation}))
+                      (tokens/verify cfg {:token token :iss :team-invitation}))

         ;; If invitation member-id does not matches the profile-id, we just proceed to ignore the
         ;; invitation because invitations matches exactly; and user can't login with other email and
@@ -152,11 +153,11 @@
 (defn recover-profile
   [{:keys [::db/conn] :as cfg} {:keys [token password]}]
   (letfn [(validate-token [token]
-            (let [tdata (tokens/verify (::setup/props cfg) {:token token :iss :password-recovery})]
+            (let [tdata (tokens/verify cfg {:token token :iss :password-recovery})]
               (:profile-id tdata)))

           (update-password [conn profile-id]
-            (let [pwd (profile/derive-password cfg password)]
+            (let [pwd (auth/derive-password password)]
               (db/update! conn :profile {:password pwd :is-active true} {:id profile-id})
               nil))]
@@ -191,7 +192,7 @@
               :hint "registration disabled"))

   (when (contains? params :invitation-token)
-    (let [invitation (tokens/verify (::setup/props cfg)
+    (let [invitation (tokens/verify cfg
                                     {:token (:invitation-token params)
                                      :iss :team-invitation})]
       (when-not (= (:email params) (:member-email invitation))
@@ -248,7 +249,7 @@
                 :props {:newsletter-updates (or accept-newsletter-updates false)}}
         params (d/without-nils params)
-        token  (tokens/generate (::setup/props cfg) params)]
+        token  (tokens/generate cfg params)]
     (with-meta {:token token}
       {::audit/profile-id uuid/zero})))
@@ -342,14 +343,14 @@
 (defn send-email-verification!
   [{:keys [::db/conn] :as cfg} profile]
-  (let [vtoken (tokens/generate (::setup/props cfg)
+  (let [vtoken (tokens/generate cfg
                                 {:iss :verify-email
                                  :exp (ct/in-future "72h")
                                  :profile-id (:id profile)
                                  :email (:email profile)})
         ;; NOTE: this token is mainly used for possible complains
         ;; identification on the sns webhook
-        ptoken (tokens/generate (::setup/props cfg)
+        ptoken (tokens/generate cfg
                                 {:iss :profile-identity
                                  :profile-id (:id profile)
                                  :exp (ct/in-future {:days 30})})]
@@ -363,7 +364,7 @@
 (defn register-profile
   [{:keys [::db/conn ::wrk/executor] :as cfg} {:keys [token] :as params}]
-  (let [claims  (tokens/verify (::setup/props cfg) {:token token :iss :prepared-register})
+  (let [claims  (tokens/verify cfg {:token token :iss :prepared-register})
         params  (into claims params)

         profile (if-let [profile-id (:profile-id claims)]
@@ -378,7 +379,7 @@
                                (not (contains? cf/flags :email-verification)))
                   params    (-> params
                                 (assoc :is-active is-active)
-                                (update :password #(profile/derive-password cfg %)))
+                                (update :password auth/derive-password))
                   profile   (->> (create-profile! conn params)
                                  (create-profile-rels! conn))]
               (vary-meta profile assoc :created true))))
@@ -386,7 +387,7 @@
         created?   (-> profile meta :created true?)

         invitation (when-let [token (:invitation-token params)]
-                      (tokens/verify (::setup/props cfg) {:token token :iss :team-invitation}))
+                      (tokens/verify cfg {:token token :iss :team-invitation}))

         props      (-> (audit/profile->props profile)
                        (assoc :from-invitation (some? invitation)))
@@ -419,7 +420,7 @@
            (= (:email profile)
               (:member-email invitation)))
       (let [claims (assoc invitation :member-id (:id profile))
-            token  (tokens/generate (::setup/props cfg) claims)]
+            token  (tokens/generate cfg claims)]
         (-> {:invitation-token token}
             (rph/with-transform (session/create-fn cfg (:id profile)))
             (rph/with-meta {::audit/replace-props props
@@ -493,14 +494,14 @@
 (defn- request-profile-recovery
   [{:keys [::db/conn] :as cfg} {:keys [email] :as params}]
   (letfn [(create-recovery-token [{:keys [id] :as profile}]
-            (let [token (tokens/generate (::setup/props cfg)
+            (let [token (tokens/generate cfg
                                          {:iss :password-recovery
                                           :exp (ct/in-future "15m")
                                           :profile-id id})]
               (assoc profile :token token)))

           (send-email-notification [conn profile]
-            (let [ptoken (tokens/generate (::setup/props cfg)
+            (let [ptoken (tokens/generate cfg
                                           {:iss :profile-identity
                                            :profile-id (:id profile)
                                            :exp (ct/in-future {:days 30})})]


@@ -28,7 +28,6 @@
    [app.tasks.file-gc]
    [app.util.services :as sv]
    [app.worker :as-alias wrk]
-   [promesa.exec :as px]
    [yetti.response :as yres]))

 (set! *warn-on-reflection* true)
@@ -94,7 +93,7 @@
 ;; --- Command: import-binfile

 (defn- import-binfile
-  [{:keys [::db/pool ::wrk/executor] :as cfg} {:keys [profile-id project-id version name file]}]
+  [{:keys [::db/pool] :as cfg} {:keys [profile-id project-id version name file]}]
   (let [team (teams/get-team pool
                              :profile-id profile-id
                              :project-id project-id)
@@ -105,13 +104,9 @@
                  (assoc ::bfc/name name)
                  (assoc ::bfc/input (:path file)))

-        ;; NOTE: the importation process performs some operations that are
-        ;; not very friendly with virtual threads, and for avoid
-        ;; unexpected blocking of other concurrent operations we dispatch
-        ;; that operation to a dedicated executor.
         result (case (int version)
-                 1 (px/invoke! executor (partial bf.v1/import-files! cfg))
-                 3 (px/invoke! executor (partial bf.v3/import-files! cfg)))]
+                 1 (bf.v1/import-files! cfg)
+                 3 (bf.v3/import-files! cfg))]

     (db/update! pool :project
                 {:modified-at (ct/now)}


@@ -6,6 +6,7 @@
 (ns app.rpc.commands.comments
   (:require
+   [app.binfile.common :as bfc]
    [app.common.data :as d]
    [app.common.data.macros :as dm]
    [app.common.exceptions :as ex]
@@ -163,34 +164,16 @@
 (def xf-decode-row
   (map decode-row))

-(def ^:private
-  sql:get-file
-  "SELECT f.id, f.modified_at, f.revn, f.features, f.name,
-          f.project_id, p.team_id, f.data,
-          f.data_ref_id, f.data_backend
-     FROM file as f
-    INNER JOIN project as p on (p.id = f.project_id)
-    WHERE f.id = ?
-      AND (f.deleted_at IS NULL OR f.deleted_at > now())")
-
 (defn- get-file
   "A specialized version of get-file for comments module."
   [cfg file-id page-id]
-  (let [file (db/exec-one! cfg [sql:get-file file-id])]
-    (when-not file
-      (ex/raise :type :not-found
-                :code :object-not-found
-                :hint "file not found"))
-    (binding [pmap/*load-fn* (partial feat.fdata/load-pointer cfg file-id)]
-      (let [file (->> file
-                      (files/decode-row)
-                      (feat.fdata/resolve-file-data cfg))
-            data (get file :data)]
-        (-> file
-            (assoc :page-name (dm/get-in data [:pages-index page-id :name]))
-            (assoc :page-id page-id)
-            (dissoc :data))))))
+  (binding [pmap/*load-fn* (partial feat.fdata/load-pointer cfg file-id)]
+    (let [file (bfc/get-file cfg file-id)
+          data (get file :data)]
+      (-> file
+          (assoc :page-name (dm/get-in data [:pages-index page-id :name]))
+          (assoc :page-id page-id)
+          (dissoc :data)))))
;; FIXME: rename ;; FIXME: rename
(defn- get-comment-thread (defn- get-comment-thread
@@ -251,7 +234,9 @@
   (files/check-comment-permissions! conn profile-id file-id share-id)
   (get-comment-threads conn profile-id file-id))))

-(def ^:private sql:comment-threads
+(defn- get-comment-threads-sql
+  [where]
+  (str/ffmt
   "SELECT DISTINCT ON (ct.id)
           ct.*,
           pf.fullname AS owner_fullname,
@@ -274,11 +259,14 @@
    INNER JOIN project AS p ON (p.id = f.project_id)
     LEFT JOIN comment_thread_status AS cts ON (cts.thread_id = ct.id AND cts.profile_id = ?)
     LEFT JOIN profile AS pf ON (ct.owner_id = pf.id)
-   WINDOW w AS (PARTITION BY c.thread_id ORDER BY c.created_at ASC)")
+   WHERE f.deleted_at IS NULL
+     AND p.deleted_at IS NULL
+     %1
+   WINDOW w AS (PARTITION BY c.thread_id ORDER BY c.created_at ASC)"
+   where))

 (def ^:private sql:comment-threads-by-file-id
-  (str "WITH threads AS (" sql:comment-threads ")"
-       "SELECT * FROM threads WHERE file_id = ?"))
+  (get-comment-threads-sql "AND ct.file_id = ?"))
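The refactor in this hunk replaces a single fixed SQL string with a function that splices a caller-supplied `WHERE` fragment into a shared base query, so each command derives its own statement once at load time instead of wrapping a shared CTE with string concatenation at call time. A reduced sketch of the idea, using the same cuerdas `str/ffmt` positional formatting as the diff (table and column names here are illustrative only):

```clojure
(require '[cuerdas.core :as str])

(defn threads-sql
  "Build a comment-threads query with an extra WHERE fragment spliced
  in at the %1 placeholder. The fragment only narrows the result set;
  user-supplied values still travel as `?` parameters, never as text."
  [where]
  (str/ffmt "SELECT t.* FROM comment_thread AS t
              WHERE t.deleted_at IS NULL
                %1" where))

;; Each caller derives its own prepared statement string once:
(def sql:threads-by-file (threads-sql "AND t.file_id = ?"))
(def sql:threads-by-team (threads-sql "AND t.team_id = ?"))
```

The important property is that the spliced fragment is a compile-time constant chosen by the code, while runtime values remain bound through `?` placeholders.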
 (defn- get-comment-threads
   [conn profile-id file-id]
@@ -287,7 +275,30 @@

 ;; --- COMMAND: Get Unread Comment Threads

-(declare ^:private get-unread-comment-threads)
+(def ^:private sql:unread-all-comment-threads-by-team
+  (str "WITH threads AS ("
+       (get-comment-threads-sql "AND p.team_id = ?")
+       ")"
+       "SELECT t.* FROM threads AS t
+         WHERE t.count_unread_comments > 0"))
+
+(def ^:private sql:unread-partial-comment-threads-by-team
+  (str "WITH threads AS ("
+       (get-comment-threads-sql "AND p.team_id = ? AND (ct.owner_id = ? OR ? = ANY(ct.mentions))")
+       ")"
+       "SELECT t.* FROM threads AS t
+         WHERE t.count_unread_comments > 0"))
+
+(defn- get-unread-comment-threads
+  [cfg profile-id team-id]
+  (let [profile (-> (db/get cfg :profile {:id profile-id} ::db/remove-deleted false)
+                    (profile/decode-row))
+        notify  (or (-> profile :props :notifications :dashboard-comments) :all)
+        result  (case notify
+                  :all (db/exec! cfg [sql:unread-all-comment-threads-by-team profile-id team-id])
+                  :partial (db/exec! cfg [sql:unread-partial-comment-threads-by-team profile-id team-id profile-id profile-id])
+                  [])]
+    (into [] xf-decode-row result)))
 (def ^:private
   schema:get-unread-comment-threads
@@ -298,41 +309,8 @@
   {::doc/added "1.15"
    ::sm/params schema:get-unread-comment-threads}
   [cfg {:keys [::rpc/profile-id team-id] :as params}]
-  (db/run!
-   cfg
-   (fn [{:keys [::db/conn]}]
-     (teams/check-read-permissions! conn profile-id team-id)
-     (get-unread-comment-threads conn profile-id team-id))))
-
-(def sql:unread-all-comment-threads-by-team
-  (str "WITH threads AS (" sql:comment-threads ")"
-       "SELECT * FROM threads WHERE count_unread_comments > 0 AND team_id = ?"))
-
-;; The partial configuration will retrieve only comments created by the user and
-;; threads that have a mention to the user.
-(def sql:unread-partial-comment-threads-by-team
-  (str "WITH threads AS (" sql:comment-threads ")"
-       "SELECT * FROM threads
-         WHERE count_unread_comments > 0
-           AND team_id = ?
-           AND (owner_id = ? OR ? = ANY(mentions))"))
-
-(defn- get-unread-comment-threads
-  [conn profile-id team-id]
-  (let [profile (-> (db/get conn :profile {:id profile-id})
-                    (profile/decode-row))
-        notify  (or (-> profile :props :notifications :dashboard-comments) :all)]
-    (case notify
-      :all
-      (->> (db/exec! conn [sql:unread-all-comment-threads-by-team profile-id team-id])
-           (into [] xf-decode-row))
-      :partial
-      (->> (db/exec! conn [sql:unread-partial-comment-threads-by-team profile-id team-id profile-id profile-id])
-           (into [] xf-decode-row))
-      [])))
+  (teams/check-read-permissions! cfg profile-id team-id)
+  (get-unread-comment-threads cfg profile-id team-id))

 ;; --- COMMAND: Get Single Comment Thread
@@ -343,16 +321,17 @@
    [:id ::sm/uuid]
    [:share-id {:optional true} [:maybe ::sm/uuid]]])

+(def ^:private sql:get-comment-thread
+  (get-comment-threads-sql "AND ct.file_id = ? AND ct.id = ?"))
+
 (sv/defmethod ::get-comment-thread
   {::doc/added "1.15"
    ::sm/params schema:get-comment-thread}
   [cfg {:keys [::rpc/profile-id file-id id share-id] :as params}]
   (db/run! cfg (fn [{:keys [::db/conn]}]
                  (files/check-comment-permissions! conn profile-id file-id share-id)
-                 (let [sql (str "WITH threads AS (" sql:comment-threads ")"
-                                "SELECT * FROM threads WHERE id = ? AND file_id = ?")]
-                   (-> (db/exec-one! conn [sql profile-id id file-id])
-                       (decode-row))))))
+                 (some-> (db/exec-one! conn [sql:get-comment-thread profile-id file-id id])
+                         (decode-row)))))

 ;; --- COMMAND: Retrieve Comments

View File

@@ -7,6 +7,7 @@
 (ns app.rpc.commands.demo
   "A demo specific mutations."
   (:require
+   [app.auth :refer [derive-password]]
    [app.common.exceptions :as ex]
    [app.common.time :as ct]
    [app.config :as cf]
@@ -14,7 +15,6 @@
    [app.loggers.audit :as audit]
    [app.rpc :as-alias rpc]
    [app.rpc.commands.auth :as auth]
-   [app.rpc.commands.profile :as profile]
    [app.rpc.doc :as-alias doc]
    [app.util.services :as sv]
    [buddy.core.codecs :as bc]
@@ -45,8 +45,9 @@
         params {:email email
                 :fullname fullname
                 :is-active true
+                :is-demo true
                 :deleted-at (ct/in-future (cf/get-deletion-delay))
-                :password (profile/derive-password cfg password)
+                :password (derive-password password)
                 :props {}}
         profile (db/tx-run! cfg (fn [{:keys [::db/conn]}]
                                   (->> (auth/create-profile! conn params)

View File

@@ -17,6 +17,7 @@
    [app.common.schema :as sm]
    [app.common.schema.desc-js-like :as-alias smdj]
    [app.common.time :as ct]
+   [app.common.transit :as t]
    [app.common.types.components-list :as ctkl]
    [app.common.types.file :as ctf]
    [app.common.uri :as uri]
@@ -24,10 +25,11 @@
    [app.db :as db]
    [app.db.sql :as-alias sql]
    [app.features.fdata :as feat.fdata]
-   [app.features.file-migrations :as feat.fmigr]
    [app.features.logical-deletion :as ldel]
    [app.loggers.audit :as-alias audit]
    [app.loggers.webhooks :as-alias webhooks]
+   [app.msgbus :as mbus]
+   [app.redis :as rds]
    [app.rpc :as-alias rpc]
    [app.rpc.commands.projects :as projects]
    [app.rpc.commands.teams :as teams]
@@ -39,8 +41,7 @@
    [app.util.pointer-map :as pmap]
    [app.util.services :as sv]
    [app.worker :as wrk]
-   [cuerdas.core :as str]
-   [promesa.exec :as px]))
+   [cuerdas.core :as str]))
 ;; --- FEATURES
@@ -55,12 +56,10 @@
   (ct/duration {:days 7}))

 (defn decode-row
-  [{:keys [data changes features] :as row}]
+  [{:keys [features] :as row}]
   (when row
     (cond-> row
-      features (assoc :features (db/decode-pgarray features #{}))
-      changes  (assoc :changes (blob/decode changes))
-      data     (assoc :data (blob/decode data)))))
+      (db/pgarray? features) (assoc :features (db/decode-pgarray features #{})))))

 (defn check-version!
   [file]
@@ -84,8 +83,10 @@
           fpr.is_admin,
           fpr.can_edit
      from file_profile_rel as fpr
+    inner join file as f on (f.id = fpr.file_id)
     where fpr.file_id = ?
       and fpr.profile_id = ?
+      and f.deleted_at is null
     union all
    select tpr.is_owner,
           tpr.is_admin,
@@ -95,6 +96,7 @@
     inner join file as f on (p.id = f.project_id)
     where f.id = ?
       and tpr.profile_id = ?
+      and f.deleted_at is null
     union all
    select ppr.is_owner,
           ppr.is_admin,
@@ -102,7 +104,8 @@
      from project_profile_rel as ppr
     inner join file as f on (f.project_id = ppr.project_id)
     where f.id = ?
-      and ppr.profile_id = ?")
+      and ppr.profile_id = ?
+      and f.deleted_at is null")

 (defn get-file-permissions
   [conn profile-id file-id]
@@ -207,93 +210,11 @@
   schema:get-file
   [:map {:title "get-file"}
    [:features {:optional true} ::cfeat/features]
-   [:id ::sm/uuid]
-   [:project-id {:optional true} ::sm/uuid]])
-
-(defn- migrate-file
-  [{:keys [::db/conn] :as cfg} {:keys [id] :as file} {:keys [read-only?]}]
-  (binding [pmap/*load-fn* (partial feat.fdata/load-pointer cfg id)
-            pmap/*tracked* (pmap/create-tracked)]
-    (let [libs (delay (bfc/get-resolved-file-libraries cfg file))
-          ;; For avoid unnecesary overhead of creating multiple pointers and
-          ;; handly internally with objects map in their worst case (when
-          ;; probably all shapes and all pointers will be readed in any
-          ;; case), we just realize/resolve them before applying the
-          ;; migration to the file
-          file (-> file
-                   (update :data feat.fdata/process-pointers deref)
-                   (update :data feat.fdata/process-objects (partial into {}))
-                   (fmg/migrate-file libs))]
-      (if (or read-only? (db/read-only? conn))
-        file
-        (let [;; When file is migrated, we break the rule of no perform
-              ;; mutations on get operations and update the file with all
-              ;; migrations applied
-              file (if (contains? (:features file) "fdata/objects-map")
-                     (feat.fdata/enable-objects-map file)
-                     file)
-              file (if (contains? (:features file) "fdata/pointer-map")
-                     (feat.fdata/enable-pointer-map file)
-                     file)]
-          (db/update! conn :file
-                      {:data (blob/encode (:data file))
-                       :version (:version file)
-                       :features (db/create-array conn "text" (:features file))}
-                      {:id id}
-                      {::db/return-keys false})
-          (when (contains? (:features file) "fdata/pointer-map")
-            (feat.fdata/persist-pointers! cfg id))
-          (feat.fmigr/upsert-migrations! conn file)
-          (feat.fmigr/resolve-applied-migrations cfg file))))))
-
-(defn get-file
-  [{:keys [::db/conn ::wrk/executor] :as cfg} id
-   & {:keys [project-id
-             migrate?
-             include-deleted?
-             lock-for-update?
-             preload-pointers?]
-      :or {include-deleted? false
-           lock-for-update? false
-           migrate? true
-           preload-pointers? false}
-      :as options}]
-  (assert (db/connection? conn) "expected cfg with valid connection")
-  (let [params (merge {:id id}
-                      (when (some? project-id)
-                        {:project-id project-id}))
-        file   (->> (db/get conn :file params
-                            {::db/check-deleted (not include-deleted?)
-                             ::db/remove-deleted (not include-deleted?)
-                             ::sql/for-update lock-for-update?})
-                    (feat.fmigr/resolve-applied-migrations cfg)
-                    (feat.fdata/resolve-file-data cfg))
-
-        ;; NOTE: we perform the file decoding in a separate thread
-        ;; because it has heavy and synchronous operations for
-        ;; decoding file body that are not very friendly with virtual
-        ;; threads.
-        file   (px/invoke! executor #(decode-row file))
-
-        file   (if (and migrate? (fmg/need-migration? file))
-                 (migrate-file cfg file options)
-                 file)]
-    (if preload-pointers?
-      (binding [pmap/*load-fn* (partial feat.fdata/load-pointer cfg id)]
-        (update file :data feat.fdata/process-pointers deref))
-      file)))
+   [:id ::sm/uuid]])

 (defn get-minimal-file
   [cfg id & {:as opts}]
-  (let [opts (assoc opts ::sql/columns [:id :modified-at :deleted-at :revn :vern :data-ref-id :data-backend])]
+  (let [opts (assoc opts ::sql/columns [:id :modified-at :deleted-at :revn :vern])]
    (db/get cfg :file {:id id} opts)))

 (defn- get-minimal-file-with-perms
@@ -333,9 +254,9 @@
                     :project-id project-id
                     :file-id id)

-        file (-> (get-file cfg id :project-id project-id)
+        file (-> (bfc/get-file cfg id
+                               :project-id project-id)
                  (assoc :permissions perms)
+                 (assoc :team-id (:id team))
                  (check-version!))]

     (-> (cfeat/get-team-enabled-features cf/flags team)
@@ -349,8 +270,7 @@
       ;; return a complete file
       (if (and (contains? (:features file) "fdata/pointer-map")
                (not (contains? (:features params) "fdata/pointer-map")))
-        (binding [pmap/*load-fn* (partial feat.fdata/load-pointer cfg id)]
-          (update file :data feat.fdata/process-pointers deref))
+        (feat.fdata/realize-pointers cfg file)
         file)

       ;; This operation is needed for backward comapatibility with
@@ -358,7 +278,7 @@
       ;; just converts all objects map instaces to plain maps
       (if (and (contains? (:features file) "fdata/objects-map")
                (not (contains? (:features params) "fdata/objects-map")))
-        (update file :data feat.fdata/process-objects (partial into {}))
+        (feat.fdata/realize-objects cfg file)
         file)))))
 ;; --- COMMAND QUERY: get-file-fragment (by id)
@@ -378,10 +298,8 @@
 (defn- get-file-fragment
   [cfg file-id fragment-id]
-  (let [resolve-file-data (partial feat.fdata/resolve-file-data cfg)]
-    (some-> (db/get cfg :file-data-fragment {:file-id file-id :id fragment-id})
-            (resolve-file-data)
-            (update :data blob/decode))))
+  (some-> (db/get cfg :file-data {:file-id file-id :id fragment-id :type "fragment"})
+          (update :data blob/decode)))

 (sv/defmethod ::get-file-fragment
   "Retrieve a file fragment by its ID. Only authenticated users."
@@ -540,7 +458,7 @@
   (let [perms (get-permissions conn profile-id file-id share-id)

-        file  (get-file cfg file-id :read-only? true)
+        file  (bfc/get-file cfg file-id :read-only? true)

         proj  (db/get conn :project {:id (:project-id file)})
@@ -596,87 +514,136 @@
 ;; --- COMMAND QUERY: get-team-shared-files

-(defn- components-and-variants
-  "Return a set with all the variant-ids, and a list of components, but with
-  only one component by variant"
-  [components]
-  (let [{:keys [variant-ids components]}
-        (reduce (fn [{:keys [variant-ids components] :as acc} {:keys [variant-id] :as component}]
-                  (cond
-                    (nil? variant-id)
-                    {:variant-ids variant-ids :components (conj components component)}
-
-                    (contains? variant-ids variant-id)
-                    acc
-
-                    :else
-                    {:variant-ids (conj variant-ids variant-id) :components (conj components component)}))
-                {:variant-ids #{} :components []}
-                components)]
-    {:components components
-     :variant-ids variant-ids}))
-
-(def ^:private sql:team-shared-files
-  "select f.id,
-          f.revn,
-          f.vern,
-          f.data,
-          f.project_id,
-          f.created_at,
-          f.modified_at,
-          f.data_backend,
-          f.data_ref_id,
-          f.name,
-          f.version,
-          f.is_shared,
-          ft.media_id,
-          p.team_id
-     from file as f
-    inner join project as p on (p.id = f.project_id)
-     left join file_thumbnail as ft on (ft.file_id = f.id and ft.revn = f.revn and ft.deleted_at is null)
-    where f.is_shared = true
-      and f.deleted_at is null
-      and p.deleted_at is null
-      and p.team_id = ?
-    order by f.modified_at desc")
-
-(defn- get-library-summary
-  [cfg {:keys [id data] :as file}]
-  (letfn [(assets-sample [assets limit]
-            (let [sorted-assets (->> (vals assets)
-                                     (sort-by #(str/lower (:name %))))]
-              {:count (count sorted-assets)
-               :sample (into [] (take limit sorted-assets))}))]
-    (binding [pmap/*load-fn* (partial feat.fdata/load-pointer cfg id)]
-      (let [load-objects (fn [component]
-                           (ctf/load-component-objects data component))
-            comps-and-variants (components-and-variants (ctkl/components-seq data))
-            components (into {} (map (juxt :id identity) (:components comps-and-variants)))
-            components-sample (-> (assets-sample components 4)
-                                  (update :sample #(mapv load-objects %))
-                                  (assoc :variants-count (-> comps-and-variants :variant-ids count)))]
-        {:components components-sample
-         :media (assets-sample (:media data) 3)
-         :colors (assets-sample (:colors data) 3)
-         :typographies (assets-sample (:typographies data) 3)}))))
-
-(defn- get-team-shared-files
-  [{:keys [::db/conn] :as cfg} {:keys [team-id profile-id]}]
-  (teams/check-read-permissions! conn profile-id team-id)
-  (->> (db/exec! conn [sql:team-shared-files team-id])
-       (into #{} (comp
-                  ;; NOTE: this decode operation is a workaround for a
-                  ;; fast fix, this should be approached with a more
-                  ;; efficient implementation, for now it loads all
-                  ;; the files in memory.
-                  (map (partial bfc/decode-file cfg))
-                  (map (fn [row]
-                         (if-let [media-id (:media-id row)]
-                           (-> row
-                               (dissoc :media-id)
-                               (assoc :thumbnail-id media-id))
-                           (dissoc row :media-id))))
-                  (map #(assoc % :library-summary (get-library-summary cfg %)))
-                  (map #(dissoc % :data))))))
+(defn- get-components-with-variants
+  "Return a set with all the variant-ids, and a list of components, but
+  with only one component by variant.
+
+  Returns a vector of unique components and a set of all variant ids"
+  [fdata]
+  (loop [variant-ids #{}
+         components' []
+         components  (ctkl/components-seq fdata)]
+    (if-let [{:keys [variant-id] :as component} (first components)]
+      (cond
+        (nil? variant-id)
+        (recur variant-ids
+               (conj components' component)
+               (rest components))
+
+        (contains? variant-ids variant-id)
+        (recur variant-ids
+               components'
+               (rest components))
+
+        :else
+        (recur (conj variant-ids variant-id)
+               (conj components' component)
+               (rest components)))
+      [(d/index-by :id components') variant-ids])))
+
+(defn- sample-assets
+  [assets limit]
+  (let [assets (into [] (map val) assets)]
+    {:count (count assets)
+     :sample (->> assets
+                  (sort-by #(str/lower (:name %)))
+                  (into [] (take limit)))}))
+
+(defn- calculate-library-summary
+  "Calculate the file library summary (counters and samples)"
+  [{:keys [data] :as file}]
+  (let [load-objects
+        (fn [sample]
+          (mapv #(ctf/load-component-objects data %) sample))
+
+        [components variant-ids]
+        (get-components-with-variants data)
+
+        components-sample
+        (-> (sample-assets components 4)
+            (update :sample load-objects))]
+
+    {:components components-sample
+     :variants {:count (count variant-ids)}
+     :colors (sample-assets (:colors data) 3)
+     :typographies (sample-assets (:typographies data) 3)}))
+
+(def ^:private file-summary-cache-key-ttl
+  (ct/duration {:days 30}))
+
+(def file-summary-cache-key-prefix
+  "penpot.library-summary.")
+
+(defn- get-file-with-summary
+  "Get a file without data with a summary of its local library content"
+  [cfg id]
+  (let [get-from-cache
+        (fn [{:keys [::rds/conn]} cache-key]
+          (when-let [result (rds/get conn cache-key)]
+            (let [file    (bfc/get-file cfg id :load-data? false)
+                  summary (t/decode-str result)]
+              (-> (assoc file :library-summary summary)
+                  (dissoc :data)))))
+
+        calculate-from-db
+        (fn []
+          (let [file   (bfc/get-file cfg id)
+                result (binding [pmap/*load-fn* (partial feat.fdata/load-pointer cfg id)]
+                         (calculate-library-summary file))]
+            (-> file
+                (assoc :library-summary result)
+                (dissoc :legacy-data)
+                (dissoc :data))))
+
+        persist-to-cache
+        (fn [{:keys [::rds/conn]} data cache-key]
+          (rds/set conn cache-key (t/encode-str data)
+                   (rds/build-set-args {:ex file-summary-cache-key-ttl})))]
+
+    (if (contains? cf/flags :redis-cache)
+      (let [cache-key (str file-summary-cache-key-prefix id)]
+        (or (rds/run! cfg get-from-cache cache-key)
+            (let [file (calculate-from-db)]
+              (rds/run! cfg persist-to-cache (:library-summary file) cache-key)
+              file)))
+      (calculate-from-db))))
+
+(def ^:private sql:team-shared-files
+  "WITH file_library_agg AS (
+     SELECT flr.file_id,
+            coalesce(array_agg(flr.library_file_id) filter (WHERE flr.library_file_id IS NOT NULL), '{}') AS library_file_ids
+       FROM file_library_rel flr
+      GROUP BY flr.file_id
+   )
+   SELECT f.id,
+          fla.library_file_ids,
+          ft.media_id AS thumbnail_id
+     FROM file AS f
+    INNER JOIN project AS p ON (p.id = f.project_id)
+     LEFT JOIN file_thumbnail AS ft ON (ft.file_id = f.id AND ft.revn = f.revn AND ft.deleted_at IS NULL)
+     LEFT JOIN file_library_agg AS fla ON (fla.file_id = f.id)
+    WHERE f.is_shared = true
+      AND f.deleted_at IS NULL
+      AND p.deleted_at IS NULL
+      AND p.team_id = ?
+    ORDER BY f.modified_at DESC")
+
+(defn- get-team-shared-files
+  [{:keys [::db/conn] :as cfg} {:keys [team-id profile-id]}]
+  (teams/check-read-permissions! conn profile-id team-id)
+
+  (let [process-row
+        (fn [{:keys [id library-file-ids]}]
+          (let [file (get-file-with-summary cfg id)]
+            (assoc file :library-file-ids (db/decode-pgarray library-file-ids #{}))))
+
+        xform
+        (map process-row)]
+
+    (->> (db/plan conn [sql:team-shared-files team-id] {:fetch-size 1})
+         (transduce xform conj #{}))))
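The `get-file-with-summary` logic added in this hunk follows a cache-aside shape: try the cache, fall back to computing from the database, then write the result back with a TTL so it expires on its own. A reduced, dependency-free sketch of that control flow (the helper names here are illustrative, not Penpot's real API):

```clojure
(defn cached-lookup
  "Cache-aside read: `lookup` and `store!` wrap the cache (store!
  should also set a TTL), `compute` derives the value from the source
  of truth. Returns the cached value when present, otherwise computes,
  writes back, and returns the fresh value."
  [lookup compute store! cache-key]
  (or (lookup cache-key)
      (let [value (compute)]
        (store! cache-key value)
        value)))
```

One design consequence worth noting: because entries expire via TTL rather than being invalidated on every file change, readers may briefly see a stale summary; the 30-day TTL above only bounds how long an orphaned entry can live.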
 (def ^:private schema:get-team-shared-files
   [:map {:title "get-team-shared-files"}
@@ -689,6 +656,28 @@
   [cfg {:keys [::rpc/profile-id] :as params}]
   (db/tx-run! cfg get-team-shared-files (assoc params :profile-id profile-id)))

+;; --- COMMAND QUERY: get-file-summary
+
+(defn- get-file-summary
+  [cfg id]
+  (let [file (get-file-with-summary cfg id)]
+    (-> (:library-summary file)
+        (assoc :name (:name file)))))
+
+(def ^:private
+  schema:get-file-summary
+  [:map {:title "get-file-summary"}
+   [:id ::sm/uuid]])
+
+(sv/defmethod ::get-file-summary
+  "Retrieve a file summary by its ID. Only authenticated users."
+  {::doc/added "1.20"
+   ::sm/params schema:get-file-summary}
+  [cfg {:keys [::rpc/profile-id id] :as params}]
+  (check-read-permissions! cfg profile-id id)
+  (get-file-summary cfg id))
+
 ;; --- COMMAND QUERY: get-file-libraries

 (def ^:private schema:get-file-libraries
@@ -707,7 +696,6 @@
 ;; --- COMMAND QUERY: Files that use this File library

 (def ^:private sql:library-using-files
   "SELECT f.id,
           f.name
@@ -777,51 +765,14 @@
     (teams/check-read-permissions! conn profile-id team-id)
     (get-team-recent-files conn team-id)))

-;; --- COMMAND QUERY: get-file-summary
-
-(defn- get-file-summary
-  [{:keys [::db/conn] :as cfg} {:keys [profile-id id project-id] :as params}]
-  (check-read-permissions! conn profile-id id)
-  (let [team (teams/get-team conn
-                             :profile-id profile-id
-                             :project-id project-id
-                             :file-id id)
-
-        file (get-file cfg id
-                       :project-id project-id
-                       :read-only? true)]
-
-    (-> (cfeat/get-team-enabled-features cf/flags team)
-        (cfeat/check-client-features! (:features params))
-        (cfeat/check-file-features! (:features file)))
-
-    (binding [pmap/*load-fn* (partial feat.fdata/load-pointer cfg id)]
-      (let [components-and-variants (components-and-variants (ctkl/components-seq (:data file)))]
-        {:name (:name file)
-         :components-count (-> components-and-variants :components count)
-         :variants-count (-> components-and-variants :variant-ids count)
-         :graphics-count (count (get-in file [:data :media] []))
-         :colors-count (count (get-in file [:data :colors] []))
-         :typography-count (count (get-in file [:data :typographies] []))}))))
-
-(sv/defmethod ::get-file-summary
-  "Retrieve a file summary by its ID. Only authenticated users."
-  {::doc/added "1.20"
-   ::sm/params schema:get-file}
-  [cfg {:keys [::rpc/profile-id] :as params}]
-  (db/tx-run! cfg get-file-summary (assoc params :profile-id profile-id)))
-
 ;; --- COMMAND QUERY: get-file-info

 (defn- get-file-info
   [{:keys [::db/conn] :as cfg} {:keys [id] :as params}]
-  (db/get* conn :file
-           {:id id}
-           {::sql/columns [:id]}))
+  (db/get conn :file
+          {:id id}
+          {::sql/columns [:id :deleted-at]}))

 (sv/defmethod ::get-file-info
   "Retrieve minimal file info by its ID."
@@ -881,7 +832,7 @@
 ;; --- MUTATION COMMAND: set-file-shared

-(def sql:get-referenced-files
+(def ^:private sql:get-referenced-files
   "SELECT f.id
      FROM file_library_rel AS flr
     INNER JOIN file AS f ON (f.id = flr.file_id)
@@ -892,13 +843,12 @@
 (defn- absorb-library-by-file!
   [cfg ldata file-id]
-  (dm/assert!
-   "expected cfg with valid connection"
-   (db/connection-map? cfg))
+  (assert (db/connection-map? cfg)
+          "expected cfg with valid connection")

   (binding [pmap/*load-fn* (partial feat.fdata/load-pointer cfg file-id)
             pmap/*tracked* (pmap/create-tracked)]
-    (let [file (-> (get-file cfg file-id
+    (let [file (-> (bfc/get-file cfg file-id
                              :include-deleted? true
                              :lock-for-update? true)
                    (update :data ctf/absorb-assets ldata))]
@@ -907,39 +857,35 @@
              :library-id (str (:id ldata))
              :file-id (str file-id))

-      (db/update! cfg :file
-                  {:revn (inc (:revn file))
-                   :data (blob/encode (:data file))
-                   :modified-at (ct/now)
-                   :has-media-trimmed false}
-                  {:id file-id})
-
-      (feat.fdata/persist-pointers! cfg file-id))))
+      (bfc/update-file! cfg {:id file-id
+                             :migrations (:migrations file)
+                             :revn (inc (:revn file))
+                             :data (:data file)
+                             :modified-at (ct/now)
+                             :has-media-trimmed false}))))

 (defn- absorb-library
   "Find all files using a shared library, and absorb all library assets
   into the file local libraries"
-  [cfg {:keys [id] :as library}]
-  (dm/assert!
-   "expected cfg with valid connection"
-   (db/connection-map? cfg))
+  [cfg {:keys [id data] :as library}]
+  (assert (db/connection-map? cfg)
+          "expected cfg with valid connection")

-  (let [ldata (binding [pmap/*load-fn* (partial feat.fdata/load-pointer cfg id)]
-                (-> library :data (feat.fdata/process-pointers deref)))
-        ids   (->> (db/exec! cfg [sql:get-referenced-files id])
-                   (map :id))]
+  (let [ids (->> (db/exec! cfg [sql:get-referenced-files id])
+                 (sequence bfc/xf-map-id))]
     (l/trc :hint "absorbing library"
            :library-id (str id)
            :files (str/join "," (map str ids)))
-    (run! (partial absorb-library-by-file! cfg ldata) ids)
+    (run! (partial absorb-library-by-file! cfg data) ids)
     library))

 (defn absorb-library!
   [{:keys [::db/conn] :as cfg} id]
-  (let [file (-> (get-file cfg id
+  (let [file (-> (bfc/get-file cfg id
+                               :realize? true
                                :lock-for-update? true
                                :include-deleted? true)
                  (check-version!))
@@ -1038,7 +984,14 @@
   (let [team (teams/get-team conn
                              :profile-id profile-id
                              :file-id id)
-        file (mark-file-deleted conn team id)]
+        file (mark-file-deleted conn team id)
+        msgbus (::mbus/msgbus cfg)]
+
+    (mbus/pub! msgbus
+               :topic id
+               :message {:type :file-deleted
+                         :file-id id
+                         :profile-id profile-id})

     (rph/with-meta (rph/wrap)
       {::audit/props {:project-id (:project-id file)
@@ -1071,6 +1024,7 @@
    [:library-id ::sm/uuid]])

 (sv/defmethod ::link-file-to-library
+  "Link a file to a library. Returns the recursive list of libraries used by that library"
   {::doc/added "1.17"
    ::webhooks/event? true
    ::sm/params schema:link-file-to-library}
@@ -1084,7 +1038,8 @@
    (fn [{:keys [::db/conn]}]
      (check-edition-permissions! conn profile-id file-id)
      (check-edition-permissions! conn profile-id library-id)
-     (link-file-to-library conn params))))
+     (link-file-to-library conn params)
+     (bfc/get-libraries cfg [library-id]))))

 ;; --- MUTATION COMMAND: unlink-file-from-library

View File

@@ -8,6 +8,7 @@
   (:require
    [app.binfile.common :as bfc]
    [app.common.features :as cfeat]
+   [app.common.files.migrations :as fmg]
    [app.common.schema :as sm]
    [app.common.time :as ct]
    [app.common.types.file :as ctf]
@@ -45,12 +46,14 @@
   (binding [pmap/*tracked* (pmap/create-tracked)
             cfeat/*current* features]
     (let [file (ctf/make-file {:id id
                                :project-id project-id
                                :name name
                                :revn revn
                                :is-shared is-shared
                                :features features
+                               :migrations fmg/available-migrations
                                :ignore-sync-until ignore-sync-until
                                :created-at modified-at
                                :deleted-at deleted-at}
@@ -66,7 +69,7 @@
                 {:modified-at (ct/now)}
                 {:id project-id})

-    file)))
+    (bfc/get-file cfg (:id file)))))

 (def ^:private schema:create-file
   [:map {:title "create-file"}

View File

@@ -8,52 +8,20 @@
   (:require
    [app.binfile.common :as bfc]
    [app.common.exceptions :as ex]
-   [app.common.files.migrations :as fmg]
-   [app.common.logging :as l]
    [app.common.schema :as sm]
    [app.common.time :as ct]
-   [app.common.uuid :as uuid]
-   [app.config :as cf]
    [app.db :as db]
    [app.db.sql :as-alias sql]
-   [app.features.fdata :as feat.fdata]
-   [app.features.file-migrations :refer [reset-migrations!]]
+   [app.features.file-snapshots :as fsnap]
+   [app.features.logical-deletion :as ldel]
    [app.main :as-alias main]
    [app.msgbus :as mbus]
    [app.rpc :as-alias rpc]
    [app.rpc.commands.files :as files]
+   [app.rpc.commands.teams :as teams]
    [app.rpc.doc :as-alias doc]
    [app.rpc.quotes :as quotes]
-   [app.storage :as sto]
-   [app.util.blob :as blob]
-   [app.util.services :as sv]
-   [cuerdas.core :as str]))
+   [app.util.services :as sv]))

-(defn decode-row
-  [{:keys [migrations] :as row}]
-  (when row
-    (cond-> row
-      (some? migrations)
-      (assoc :migrations (db/decode-pgarray migrations)))))

-(def sql:get-file-snapshots
-  "WITH changes AS (
-     SELECT id, label, revn, created_at, created_by, profile_id, locked_by
-       FROM file_change
-      WHERE file_id = ?
-        AND data IS NOT NULL
-        AND (deleted_at IS NULL OR deleted_at > now())
-   ), versions AS (
-     (SELECT * FROM changes WHERE created_by = 'system' LIMIT 1000)
-     UNION ALL
-     (SELECT * FROM changes WHERE created_by != 'system' LIMIT 1000)
-   )
-   SELECT * FROM versions
-    ORDER BY created_at DESC;")

-(defn get-file-snapshots
-  [conn file-id]
-  (db/exec! conn [sql:get-file-snapshots file-id]))
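The removed `sql:get-file-snapshots` query (now moved behind `fsnap/get-visible-snapshots`) treats a snapshot as visible while `deleted_at` is `NULL` or still in the future. A sqlite sketch of just that visibility predicate, reusing the `file_change` table and column names from the query above (the CTE split by `created_by` and the LIMITs are omitted for brevity):

```python
import sqlite3
from datetime import datetime, timedelta, timezone

# A snapshot row stays listed while deleted_at is NULL or in the future;
# expired rows are already eligible for GC and must not appear.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE file_change (id TEXT, file_id TEXT, deleted_at TEXT)")

now = datetime.now(timezone.utc)
rows = [
    ("a", "f1", None),                                   # never deleted -> visible
    ("b", "f1", (now + timedelta(days=7)).isoformat()),  # pending deletion -> visible
    ("c", "f1", (now - timedelta(days=1)).isoformat()),  # already expired -> hidden
]
conn.executemany("INSERT INTO file_change VALUES (?, ?, ?)", rows)

visible = [r[0] for r in conn.execute(
    "SELECT id FROM file_change WHERE file_id = ? "
    "AND (deleted_at IS NULL OR deleted_at > ?)", ("f1", now.isoformat()))]
```

ISO-8601 timestamps with a fixed offset compare correctly as strings, which is what makes the sqlite stand-in for `now()` work here.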
 (def ^:private schema:get-file-snapshots
   [:map {:title "get-file-snapshots"}
@@ -65,73 +33,7 @@
   [cfg {:keys [::rpc/profile-id file-id] :as params}]
   (db/run! cfg (fn [{:keys [::db/conn]}]
                  (files/check-read-permissions! conn profile-id file-id)
-                 (get-file-snapshots conn file-id))))
+                 (fsnap/get-visible-snapshots conn file-id))))

-(defn- generate-snapshot-label
-  []
-  (let [ts (-> (ct/now)
-               (ct/format-inst)
-               (str/replace #"[T:\.]" "-")
-               (str/rtrim "Z"))]
-    (str "snapshot-" ts)))
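The removed `generate-snapshot-label` helper derives a default label from the current instant by replacing `T`, `:` and `.` with `-` and trimming the trailing `Z`. A Python mirror of that string transformation (the instant is passed in explicitly instead of calling a clock):

```python
import re

def generate_snapshot_label(iso_instant: str) -> str:
    # Mirror of the Clojure helper: replace each of T, :, . with '-'
    # and trim the trailing 'Z' before prefixing "snapshot-".
    return "snapshot-" + re.sub(r"[T:.]", "-", iso_instant).rstrip("Z")
```

So an instant like `2025-11-04T16:43:32.123Z` becomes `snapshot-2025-11-04-16-43-32-123`.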
-(defn create-file-snapshot!
-  [cfg file & {:keys [label created-by deleted-at profile-id]
-               :or {deleted-at :default
-                    created-by :system}}]
-  (assert (#{:system :user :admin} created-by)
-          "expected valid keyword for created-by")
-  (let [created-by
-        (name created-by)
-
-        deleted-at
-        (cond
-          (= deleted-at :default)
-          (ct/plus (ct/now) (cf/get-deletion-delay))
-
-          (ct/inst? deleted-at)
-          deleted-at
-
-          :else
-          nil)
-
-        label
-        (or label (generate-snapshot-label))
-
-        snapshot-id
-        (uuid/next)
-
-        data
-        (blob/encode (:data file))
-
-        features
-        (into-array (:features file))
-
-        migrations
-        (into-array (:migrations file))]
-
-    (l/dbg :hint "creating file snapshot"
-           :file-id (str (:id file))
-           :id (str snapshot-id)
-           :label label)
-
-    (db/insert! cfg :file-change
-                {:id snapshot-id
-                 :revn (:revn file)
-                 :data data
-                 :version (:version file)
-                 :features features
-                 :migrations migrations
-                 :profile-id profile-id
-                 :file-id (:id file)
-                 :label label
-                 :deleted-at deleted-at
-                 :created-by created-by}
-                {::db/return-keys false})
-
-    {:id snapshot-id :label label}))

 (def ^:private schema:create-file-snapshot
   [:map
@@ -144,7 +46,7 @@
    ::db/transaction true}
   [{:keys [::db/conn] :as cfg} {:keys [::rpc/profile-id file-id label]}]
   (files/check-edition-permissions! conn profile-id file-id)
-  (let [file    (bfc/get-file cfg file-id)
+  (let [file    (bfc/get-file cfg file-id :realize? true)
         project (db/get-by-id cfg :project (:project-id file))]
     (-> cfg
@@ -155,96 +57,10 @@
         (quotes/check! {::quotes/id ::quotes/snapshots-per-file}
                        {::quotes/id ::quotes/snapshots-per-team}))
-    (create-file-snapshot! cfg file
-                           {:label label
-                            :profile-id profile-id
-                            :created-by :user})))
+    (fsnap/create! cfg file
+                   {:label label
+                    :profile-id profile-id
+                    :created-by "user"})))

-(defn restore-file-snapshot!
-  [{:keys [::db/conn ::mbus/msgbus] :as cfg} file-id snapshot-id]
-  (let [storage  (sto/resolve cfg {::db/reuse-conn true})
-        file     (files/get-minimal-file conn file-id {::db/for-update true})
-        vern     (rand-int Integer/MAX_VALUE)
-        snapshot (some->> (db/get* conn :file-change
-                                   {:file-id file-id
-                                    :id snapshot-id}
-                                   {::db/for-share true})
-                          (feat.fdata/resolve-file-data cfg)
-                          (decode-row))
-
-        ;; If snapshot has tracked applied migrations, we reuse them,
-        ;; if not we take a safest set of migrations as starting
-        ;; point. This is because, at the time of implementing
-        ;; snapshots, migrations were not taken into account so we
-        ;; need to make this backward compatible in some way.
-        file     (assoc file :migrations
-                        (or (:migrations snapshot)
-                            (fmg/generate-migrations-from-version 67)))]
-
-    (when-not snapshot
-      (ex/raise :type :not-found
-                :code :snapshot-not-found
-                :hint "unable to find snapshot with the provided label"
-                :snapshot-id snapshot-id
-                :file-id file-id))
-
-    (when-not (:data snapshot)
-      (ex/raise :type :validation
-                :code :snapshot-without-data
-                :hint "snapshot has no data"
-                :label (:label snapshot)
-                :file-id file-id))
-
-    (l/dbg :hint "restoring snapshot"
-           :file-id (str file-id)
-           :label (:label snapshot)
-           :snapshot-id (str (:id snapshot)))
-
-    ;; If the file was already offloaded, on restoring the snapshot we
-    ;; are going to replace the file data, so we need to touch the old
-    ;; referenced storage object and avoid possible leaks
-    (when (feat.fdata/offloaded? file)
-      (sto/touch-object! storage (:data-ref-id file)))
-
-    ;; In the same way, on reseting the file data, we need to restore
-    ;; the applied migrations on the moment of taking the snapshot
-    (reset-migrations! conn file)
-
-    (db/update! conn :file
-                {:data (:data snapshot)
-                 :revn (inc (:revn file))
-                 :vern vern
-                 :version (:version snapshot)
-                 :data-backend nil
-                 :data-ref-id nil
-                 :has-media-trimmed false
-                 :features (:features snapshot)}
-                {:id file-id})
-
-    ;; clean object thumbnails
-    (let [sql (str "update file_tagged_object_thumbnail "
-                   "   set deleted_at = now() "
-                   " where file_id=? returning media_id")
-          res (db/exec! conn [sql file-id])]
-      (doseq [media-id (into #{} (keep :media-id) res)]
-        (sto/touch-object! storage media-id)))
-
-    ;; clean file thumbnails
-    (let [sql (str "update file_thumbnail "
-                   "   set deleted_at = now() "
-                   " where file_id=? returning media_id")
-          res (db/exec! conn [sql file-id])]
-      (doseq [media-id (into #{} (keep :media-id) res)]
-        (sto/touch-object! storage media-id)))
-
-    ;; Send to the clients a notification to reload the file
-    (mbus/pub! msgbus
-               :topic (:id file)
-               :message {:type :file-restore
-                         :file-id (:id file)
-                         :vern vern})
-
-    {:id (:id snapshot)
-     :label (:label snapshot)}))
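The removed `restore-file-snapshot!` (now replaced by `fsnap/restore!`) follows a simple pattern worth keeping in mind: replace the file data, bump `revn`, assign a fresh random `vern`, and broadcast a `file-restore` event so connected clients reload. A Python sketch of that core flow, with the message bus modeled as a plain list and all names illustrative:

```python
import random

def restore_snapshot(files, file_id, snapshot, msgbus):
    # Assign a fresh random "vern" so clients can detect the discontinuity,
    # swap in the snapshot data, and advance the revision counter.
    vern = random.randint(0, 2**31 - 1)
    file = files[file_id]
    file.update(data=snapshot["data"], revn=file["revn"] + 1, vern=vern)
    # Broadcast so every connected client reloads the file.
    msgbus.append({"type": "file-restore", "file-id": file_id, "vern": vern})
    return vern
```

The random `vern` (rather than an incrementing counter) is what lets clients distinguish a restore from an ordinary revision bump.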
 (def ^:private schema:restore-file-snapshot
   [:map {:title "restore-file-snapshot"}
@@ -253,88 +69,76 @@
 (sv/defmethod ::restore-file-snapshot
   {::doc/added "1.20"
-   ::sm/params schema:restore-file-snapshot}
-  [cfg {:keys [::rpc/profile-id file-id id] :as params}]
-  (db/tx-run! cfg
-              (fn [{:keys [::db/conn] :as cfg}]
-                (files/check-edition-permissions! conn profile-id file-id)
-                (let [file (bfc/get-file cfg file-id)]
-                  (create-file-snapshot! cfg file
-                                         {:profile-id profile-id
-                                          :created-by :system})
-                  (restore-file-snapshot! cfg file-id id)))))
+   ::sm/params schema:restore-file-snapshot
+   ::db/transaction true}
+  [{:keys [::db/conn ::mbus/msgbus] :as cfg} {:keys [::rpc/profile-id file-id id] :as params}]
+  (files/check-edition-permissions! conn profile-id file-id)
+  (let [file  (bfc/get-file cfg file-id)
+        team  (teams/get-team conn
+                              :profile-id profile-id
+                              :file-id file-id)
+        delay (ldel/get-deletion-delay team)]
+    (fsnap/create! cfg file
+                   {:profile-id profile-id
+                    :deleted-at (ct/in-future delay)
+                    :created-by "system"})
+    (let [vern (fsnap/restore! cfg file-id id)]
+      ;; Send to the clients a notification to reload the file
+      (mbus/pub! msgbus
+                 :topic (:id file)
+                 :message {:type :file-restore
+                           :file-id (:id file)
+                           :vern vern})
+      nil)))

 (def ^:private schema:update-file-snapshot
   [:map {:title "update-file-snapshot"}
    [:id ::sm/uuid]
    [:label ::sm/text]])

-(defn- update-file-snapshot!
-  [conn snapshot-id label]
-  (-> (db/update! conn :file-change
-                  {:label label
-                   :created-by "user"
-                   :deleted-at nil}
-                  {:id snapshot-id}
-                  {::db/return-keys true})
-      (dissoc :data :features :migrations)))

-(defn- get-snapshot
-  "Get a minimal snapshot from database and lock for update"
-  [conn id]
-  (db/get conn :file-change
-          {:id id}
-          {::sql/columns [:id :file-id :created-by :deleted-at :profile-id :locked-by]
-           ::db/for-update true}))

 (sv/defmethod ::update-file-snapshot
   {::doc/added "1.20"
-   ::sm/params schema:update-file-snapshot}
-  [cfg {:keys [::rpc/profile-id id label]}]
-  (db/tx-run! cfg
-              (fn [{:keys [::db/conn]}]
-                (let [snapshot (get-snapshot conn id)]
-                  (files/check-edition-permissions! conn profile-id (:file-id snapshot))
-                  (update-file-snapshot! conn id label)))))
+   ::sm/params schema:update-file-snapshot
+   ::db/transaction true}
+  [{:keys [::db/conn]} {:keys [::rpc/profile-id id label]}]
+  (let [snapshot (fsnap/get-minimal-snapshot conn id)]
+    (files/check-edition-permissions! conn profile-id (:file-id snapshot))
+    (fsnap/update! conn (assoc snapshot :label label))))

 (def ^:private schema:remove-file-snapshot
   [:map {:title "remove-file-snapshot"}
    [:id ::sm/uuid]])

-(defn- delete-file-snapshot!
-  [conn snapshot-id]
-  (db/update! conn :file-change
-              {:deleted-at (ct/now)}
-              {:id snapshot-id}
-              {::db/return-keys false})
-  nil)

 (sv/defmethod ::delete-file-snapshot
   {::doc/added "1.20"
-   ::sm/params schema:remove-file-snapshot}
-  [cfg {:keys [::rpc/profile-id id]}]
-  (db/tx-run! cfg
-              (fn [{:keys [::db/conn]}]
-                (let [snapshot (get-snapshot conn id)]
-                  (files/check-edition-permissions! conn profile-id (:file-id snapshot))
-                  (when (not= (:created-by snapshot) "user")
-                    (ex/raise :type :validation
-                              :code :system-snapshots-cant-be-deleted
-                              :snapshot-id id
-                              :profile-id profile-id))
-                  ;; Check if version is locked by someone else
-                  (when (and (:locked-by snapshot)
-                             (not= (:locked-by snapshot) profile-id))
-                    (ex/raise :type :validation
-                              :code :snapshot-is-locked
-                              :hint "Cannot delete a locked version"
-                              :snapshot-id id
-                              :profile-id profile-id
-                              :locked-by (:locked-by snapshot)))
-                  (delete-file-snapshot! conn id)))))
+   ::sm/params schema:remove-file-snapshot
+   ::db/transaction true}
+  [{:keys [::db/conn]} {:keys [::rpc/profile-id id]}]
+  (let [snapshot (fsnap/get-minimal-snapshot conn id)]
+    (files/check-edition-permissions! conn profile-id (:file-id snapshot))
+    (when (not= (:created-by snapshot) "user")
+      (ex/raise :type :validation
+                :code :system-snapshots-cant-be-deleted
+                :file-id (:file-id snapshot)
+                :snapshot-id id
+                :profile-id profile-id))
+    (when (and (some? (:locked-by snapshot))
+               (not= (:locked-by snapshot) profile-id))
+      (ex/raise :type :validation
+                :code :snapshot-is-locked
+                :file-id (:file-id snapshot)
+                :snapshot-id id
+                :profile-id profile-id))
+    (let [team  (teams/get-team conn
+                                :profile-id profile-id
+                                :file-id (:file-id snapshot))
+          delay (ldel/get-deletion-delay team)]
+      (fsnap/delete! conn (assoc snapshot :deleted-at (ct/in-future delay))))))

 ;;; Lock/unlock version endpoints

@@ -342,21 +146,12 @@
   [:map {:title "lock-file-snapshot"}
    [:id ::sm/uuid]])

-(defn- lock-file-snapshot!
-  [conn snapshot-id profile-id]
-  (db/update! conn :file-change
-              {:locked-by profile-id}
-              {:id snapshot-id}
-              {::db/return-keys false})
-  nil)

 (sv/defmethod ::lock-file-snapshot
   {::doc/added "1.20"
-   ::sm/params schema:lock-file-snapshot}
-  [cfg {:keys [::rpc/profile-id id]}]
-  (db/tx-run! cfg
-              (fn [{:keys [::db/conn]}]
-                (let [snapshot (get-snapshot conn id)]
-                  (files/check-edition-permissions! conn profile-id (:file-id snapshot))
-                  (when (not= (:created-by snapshot) "user")
+   ::sm/params schema:lock-file-snapshot
+   ::db/transaction true}
+  [{:keys [::db/conn]} {:keys [::rpc/profile-id id]}]
+  (let [snapshot (fsnap/get-minimal-snapshot conn id)]
+    (files/check-edition-permissions! conn profile-id (:file-id snapshot))
+    (when (not= (:created-by snapshot) "user")
@@ -384,27 +179,18 @@
                     :profile-id profile-id
                     :locked-by (:locked-by snapshot)))
-                  (lock-file-snapshot! conn id profile-id)))))
+    (fsnap/lock-by! conn id profile-id)))

 (def ^:private schema:unlock-file-snapshot
   [:map {:title "unlock-file-snapshot"}
    [:id ::sm/uuid]])

-(defn- unlock-file-snapshot!
-  [conn snapshot-id]
-  (db/update! conn :file-change
-              {:locked-by nil}
-              {:id snapshot-id}
-              {::db/return-keys false})
-  nil)

 (sv/defmethod ::unlock-file-snapshot
   {::doc/added "1.20"
-   ::sm/params schema:unlock-file-snapshot}
-  [cfg {:keys [::rpc/profile-id id]}]
-  (db/tx-run! cfg
-              (fn [{:keys [::db/conn]}]
-                (let [snapshot (get-snapshot conn id)]
-                  (files/check-edition-permissions! conn profile-id (:file-id snapshot))
-                  (when (not= (:created-by snapshot) "user")
+   ::sm/params schema:unlock-file-snapshot
+   ::db/transaction true}
+  [{:keys [::db/conn]} {:keys [::rpc/profile-id id]}]
+  (let [snapshot (fsnap/get-minimal-snapshot conn id)]
+    (files/check-edition-permissions! conn profile-id (:file-id snapshot))
+    (when (not= (:created-by snapshot) "user")
@@ -431,4 +217,4 @@
                     :snapshot-id id
                     :profile-id profile-id))
-                  (unlock-file-snapshot! conn id)))))
+    (fsnap/unlock! conn id)))
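The delete/lock/unlock endpoints above share the same two guards: system-created snapshots cannot be touched, and a snapshot locked by another profile is off limits. A Python sketch of that guard logic, using the same error codes as the `ex/raise` calls (the exception class is illustrative):

```python
def check_snapshot_guards(snapshot, profile_id):
    # Only user-created snapshots may be deleted, locked, or unlocked.
    if snapshot["created_by"] != "user":
        raise PermissionError("system-snapshots-cant-be-deleted")
    # A snapshot locked by a different profile cannot be modified.
    locked_by = snapshot.get("locked_by")
    if locked_by is not None and locked_by != profile_id:
        raise PermissionError("snapshot-is-locked")
```

Locking by the same profile passes, which is what lets a user delete a version they locked themselves.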

View File

@@ -1,160 +0,0 @@
;; This Source Code Form is subject to the terms of the Mozilla Public
;; License, v. 2.0. If a copy of the MPL was not distributed with this
;; file, You can obtain one at http://mozilla.org/MPL/2.0/.
;;
;; Copyright (c) KALEIDOS INC
(ns app.rpc.commands.files-temp
(:require
[app.common.exceptions :as ex]
[app.common.features :as cfeat]
[app.common.files.changes :as cpc]
[app.common.schema :as sm]
[app.common.time :as ct]
[app.common.uuid :as uuid]
[app.config :as cf]
[app.db :as db]
[app.db.sql :as sql]
[app.features.fdata :as fdata]
[app.loggers.audit :as audit]
[app.rpc :as-alias rpc]
[app.rpc.commands.files :as files]
[app.rpc.commands.files-create :as files.create]
[app.rpc.commands.files-update :as-alias files.update]
[app.rpc.commands.projects :as projects]
[app.rpc.commands.teams :as teams]
[app.rpc.doc :as-alias doc]
[app.rpc.helpers :as rph]
[app.util.blob :as blob]
[app.util.pointer-map :as pmap]
[app.util.services :as sv]
[clojure.set :as set]))
;; --- MUTATION COMMAND: create-temp-file
(def ^:private schema:create-temp-file
[:map {:title "create-temp-file"}
[:name [:string {:max 250}]]
[:project-id ::sm/uuid]
[:id {:optional true} ::sm/uuid]
[:is-shared ::sm/boolean]
[:features ::cfeat/features]
[:create-page ::sm/boolean]])
(sv/defmethod ::create-temp-file
{::doc/added "1.17"
::doc/module :files
::sm/params schema:create-temp-file
::db/transaction true}
[{:keys [::db/conn] :as cfg} {:keys [::rpc/profile-id project-id] :as params}]
(projects/check-edition-permissions! conn profile-id project-id)
(let [team (teams/get-team conn :profile-id profile-id :project-id project-id)
;; When we create files, we only need to respect the team
;; features, because some features can be enabled
;; globally, but the team is still not migrated properly.
input-features
(:features params #{})
;; If the imported project doesn't contain v2 we need to remove it
team-features
(cond-> (cfeat/get-team-enabled-features cf/flags team)
(not (contains? input-features "components/v2"))
(disj "components/v2"))
;; We also include all no migration features declared by
;; client; that enables the ability to enable a runtime
;; feature on frontend and make it permanent on file
features
(-> input-features
(set/intersection cfeat/no-migration-features)
(set/union team-features))
params
(-> params
(assoc :profile-id profile-id)
(assoc :deleted-at (ct/in-future {:days 1}))
(assoc :features features))]
(files.create/create-file cfg params)))
;; --- MUTATION COMMAND: update-temp-file
(def ^:private schema:update-temp-file
[:map {:title "update-temp-file"}
[:changes [:vector cpc/schema:change]]
[:revn [::sm/int {:min 0}]]
[:session-id ::sm/uuid]
[:id ::sm/uuid]])
(sv/defmethod ::update-temp-file
{::doc/added "1.17"
::doc/module :files
::sm/params schema:update-temp-file}
[cfg {:keys [::rpc/profile-id session-id id revn changes] :as params}]
(db/tx-run! cfg (fn [{:keys [::db/conn]}]
(db/insert! conn :file-change
{:id (uuid/next)
:session-id session-id
:profile-id profile-id
:created-at (ct/now)
:file-id id
:revn revn
:data nil
:changes (blob/encode changes)})
(rph/with-meta (rph/wrap nil)
{::audit/replace-props {:file-id id
:revn revn}}))))
;; --- MUTATION COMMAND: persist-temp-file
(defn persist-temp-file
[{:keys [::db/conn] :as cfg} {:keys [id] :as params}]
(let [file (files/get-file cfg id
:migrate? false
:lock-for-update? true)]
(when (nil? (:deleted-at file))
(ex/raise :type :validation
:code :cant-persist-already-persisted-file))
(let [changes (->> (db/cursor conn
(sql/select :file-change {:file-id id}
{:order-by [[:revn :asc]]})
{:chunk-size 10})
(sequence (mapcat (comp blob/decode :changes))))
file (update file :data cpc/process-changes changes)
file (if (contains? (:features file) "fdata/objects-map")
(fdata/enable-objects-map file)
file)
file (if (contains? (:features file) "fdata/pointer-map")
(binding [pmap/*tracked* (pmap/create-tracked)]
(let [file (fdata/enable-pointer-map file)]
(fdata/persist-pointers! cfg id)
file))
file)]
;; Delete changes from the changes history
(db/delete! conn :file-change {:file-id id})
(db/update! conn :file
{:deleted-at nil
:revn 1
:data (blob/encode (:data file))}
{:id id})
nil)))
(def ^:private schema:persist-temp-file
[:map {:title "persist-temp-file"}
[:id ::sm/uuid]])
(sv/defmethod ::persist-temp-file
{::doc/added "1.17"
::doc/module :files
::sm/params schema:persist-temp-file}
[cfg {:keys [::rpc/profile-id id] :as params}]
(db/tx-run! cfg (fn [{:keys [::db/conn] :as cfg}]
(files/check-edition-permissions! conn profile-id id)
(persist-temp-file cfg params))))
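The deleted `files-temp` namespace above implements a three-step lifecycle: a temp file is created already scheduled for deletion (`deleted-at` one day out), edits accumulate as change rows, and `persist-temp-file` replays those changes into the data, drops the history, and clears `deleted-at`. A Python sketch of that flow, with change application reduced to dict merging for illustration:

```python
def persist_temp_file(file, change_log):
    # A file that is no longer scheduled for deletion was already persisted.
    if file["deleted_at"] is None:
        raise ValueError("cant-persist-already-persisted-file")
    data = dict(file["data"])
    for changes in change_log:     # replayed in revn (ascending) order
        data.update(changes)
    change_log.clear()             # the change history is dropped after persisting
    file.update(data=data, deleted_at=None, revn=1)
    return file
```

Clearing `deleted_at` is what rescues the file from the scheduled garbage collection, and resetting `revn` to 1 marks the replayed state as the new baseline.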

View File

@@ -6,6 +6,7 @@
 (ns app.rpc.commands.files-thumbnails
   (:require
+   [app.binfile.common :as bfc]
    [app.common.data :as d]
    [app.common.data.macros :as dm]
    [app.common.features :as cfeat]
@@ -202,8 +203,8 @@
                   :profile-id profile-id
                   :file-id file-id)
-        file (files/get-file cfg file-id
-                             :preload-pointers? true
-                             :read-only? true)]
+        file (bfc/get-file cfg file-id
+                           :realize? true
+                           :read-only? true)]
     (-> (cfeat/get-team-enabled-features cf/flags team)
@@ -339,6 +340,7 @@
         data  (-> (sto/content path)
                   (sto/wrap-with-hash hash))
         tnow  (ct/now)
         media (sto/put-object! storage
                                {::sto/content data
                                 ::sto/deduplicate? true

View File

@@ -19,27 +19,25 @@
[app.common.uuid :as uuid] [app.common.uuid :as uuid]
[app.config :as cf] [app.config :as cf]
[app.db :as db] [app.db :as db]
[app.features.fdata :as feat.fdata] [app.features.fdata :as fdata]
[app.features.file-migrations :as feat.fmigr] [app.features.file-snapshots :as fsnap]
[app.features.logical-deletion :as ldel] [app.features.logical-deletion :as ldel]
[app.http.errors :as errors] [app.http.errors :as errors]
[app.loggers.audit :as audit] [app.loggers.audit :as audit]
[app.loggers.webhooks :as webhooks] [app.loggers.webhooks :as webhooks]
[app.metrics :as mtx] [app.metrics :as mtx]
[app.msgbus :as mbus] [app.msgbus :as mbus]
[app.redis :as rds]
[app.rpc :as-alias rpc] [app.rpc :as-alias rpc]
[app.rpc.climit :as climit] [app.rpc.climit :as climit]
[app.rpc.commands.files :as files] [app.rpc.commands.files :as files]
[app.rpc.commands.teams :as teams] [app.rpc.commands.teams :as teams]
[app.rpc.doc :as-alias doc] [app.rpc.doc :as-alias doc]
[app.rpc.helpers :as rph] [app.rpc.helpers :as rph]
[app.storage :as sto]
[app.util.blob :as blob] [app.util.blob :as blob]
[app.util.pointer-map :as pmap] [app.util.pointer-map :as pmap]
[app.util.services :as sv] [app.util.services :as sv]
[app.worker :as wrk] [clojure.set :as set]))
[clojure.set :as set]
[promesa.exec :as px]))
(declare ^:private get-lagged-changes) (declare ^:private get-lagged-changes)
(declare ^:private send-notifications!) (declare ^:private send-notifications!)
@@ -47,6 +45,7 @@
(declare ^:private update-file*) (declare ^:private update-file*)
(declare ^:private process-changes-and-validate) (declare ^:private process-changes-and-validate)
(declare ^:private take-snapshot?) (declare ^:private take-snapshot?)
(declare ^:private invalidate-caches!)
;; PUBLIC API; intended to be used outside of this module ;; PUBLIC API; intended to be used outside of this module
(declare update-file!) (declare update-file!)
@@ -129,14 +128,15 @@
::sm/params schema:update-file ::sm/params schema:update-file
::sm/result schema:update-file-result ::sm/result schema:update-file-result
::doc/module :files ::doc/module :files
::doc/added "1.17"} ::doc/added "1.17"
[{:keys [::mtx/metrics] :as cfg} ::db/transaction true}
[{:keys [::mtx/metrics ::db/conn] :as cfg}
{:keys [::rpc/profile-id id changes changes-with-metadata] :as params}] {:keys [::rpc/profile-id id changes changes-with-metadata] :as params}]
(db/tx-run! cfg (fn [{:keys [::db/conn] :as cfg}]
(files/check-edition-permissions! conn profile-id id) (files/check-edition-permissions! conn profile-id id)
(db/xact-lock! conn id) (db/xact-lock! conn id)
(let [file (get-file conn id) (let [file (get-file cfg id)
team (teams/get-team conn team (teams/get-team conn
:profile-id profile-id :profile-id profile-id
:team-id (:team-id file)) :team-id (:team-id file))
@@ -176,8 +176,8 @@
:context {:incoming-revn (:revn params) :context {:incoming-revn (:revn params)
:stored-revn (:revn file)})) :stored-revn (:revn file)}))
;; When newly computed features does not match exactly with ;; When newly computed features does not match exactly with the
;; the features defined on team row, we update it ;; features defined on team row, we update it
(when-let [features (-> features (when-let [features (-> features
(set/difference (:features team)) (set/difference (:features team))
(set/difference cfeat/no-team-inheritable-features) (set/difference cfeat/no-team-inheritable-features)
@@ -191,6 +191,7 @@
{:id (:id team)} {:id (:id team)}
{::db/return-keys false}))) {::db/return-keys false})))
(mtx/run! metrics {:id :update-file-changes :inc (count changes)}) (mtx/run! metrics {:id :update-file-changes :inc (count changes)})
(binding [l/*context* (some-> (meta params) (binding [l/*context* (some-> (meta params)
@@ -198,7 +199,7 @@
(errors/request->context))] (errors/request->context))]
(-> (update-file* cfg params) (-> (update-file* cfg params)
(rph/with-defer #(let [elapsed (tpoint)] (rph/with-defer #(let [elapsed (tpoint)]
(l/trace :hint "update-file" :time (ct/format-duration elapsed)))))))))) (l/trace :hint "update-file" :time (ct/format-duration elapsed))))))))
(defn- update-file* (defn- update-file*
"Internal function, part of the update-file process, that encapsulates "Internal function, part of the update-file process, that encapsulates
@@ -208,31 +209,41 @@
Follow the inner implementation to `update-file-data!` function. Follow the inner implementation to `update-file-data!` function.
Only intended for internal use on this module." Only intended for internal use on this module."
[{:keys [::db/conn ::wrk/executor ::timestamp] :as cfg} [{:keys [::db/conn ::timestamp] :as cfg}
{:keys [profile-id file team features changes session-id skip-validate] :as params}] {:keys [profile-id file team features changes session-id skip-validate] :as params}]
(let [;; Retrieve the file data (binding [pmap/*tracked* (pmap/create-tracked)
file (feat.fmigr/resolve-applied-migrations cfg file) pmap/*load-fn* (partial fdata/load-pointer cfg (:id file))]
file (feat.fdata/resolve-file-data cfg file)
file (assoc file :features (let [file (assoc file :features
(-> features (-> features
(set/difference cfeat/frontend-only-features) (set/difference cfeat/frontend-only-features)
(set/union (:features file))))] (set/union (:features file))))
;; We create a new lexycal scope for clearly delimit the result of ;; We need to preserve the original revn for the response
;; executing this update file operation and all its side effects revn
(let [file (px/invoke! executor (get file :revn)
(fn []
;; Process the file data on separated thread for avoid to do file
;; the CPU intensive operation on vthread.
(binding [cfeat/*current* features (binding [cfeat/*current* features
cfeat/*previous* (:features file)] cfeat/*previous* (:features file)]
(update-file-data! cfg file (update-file-data! cfg file
process-changes-and-validate process-changes-and-validate
changes skip-validate))))] changes skip-validate))
(feat.fmigr/upsert-migrations! conn file) deleted-at
(persist-file! cfg file) (ct/plus timestamp (ct/duration {:hours 1}))]
(when-let [file (::snapshot file)]
(let [deleted-at (ct/plus timestamp (ldel/get-deletion-delay team))
label (str "internal/snapshot/" revn)]
(fsnap/create! cfg file
{:label label
:created-by "system"
:deleted-at deleted-at
:profile-id profile-id
:session-id session-id})))
;; Insert change (xlog) with deleted_at in a future data for ;; Insert change (xlog) with deleted_at in a future data for
;; make them automatically eleggible for GC once they expires ;; make them automatically eleggible for GC once they expires
@@ -242,87 +253,71 @@
:profile-id profile-id :profile-id profile-id
:created-at timestamp :created-at timestamp
:updated-at timestamp :updated-at timestamp
:deleted-at (if (::snapshot-data file) :deleted-at deleted-at
(ct/plus timestamp (ldel/get-deletion-delay team))
(ct/plus timestamp (ct/duration {:hours 1})))
:file-id (:id file) :file-id (:id file)
:revn (:revn file) :revn (:revn file)
:version (:version file) :version (:version file)
:features (:features file) :features (into-array (:features file))
:label (::snapshot-label file)
:data (::snapshot-data file)
:changes (blob/encode changes)} :changes (blob/encode changes)}
{::db/return-keys false}) {::db/return-keys false})
(persist-file! cfg file)
(when (contains? cf/flags :redis-cache)
(invalidate-caches! cfg file))
;; Send asynchronous notifications ;; Send asynchronous notifications
(send-notifications! cfg params file)) (send-notifications! cfg params file)
(when (feat.fdata/offloaded? file) (with-meta {:revn revn :lagged (get-lagged-changes conn params)}
(let [storage (sto/resolve cfg ::db/reuse-conn true)] {::audit/replace-props
(some->> (:data-ref-id file) (sto/touch-object! storage))))
(let [response {:revn (:revn file)
:lagged (get-lagged-changes conn params)}]
(vary-meta response assoc ::audit/replace-props
{:id (:id file) {:id (:id file)
:name (:name file) :name (:name file)
:features (:features file) :features (:features file)
:project-id (:project-id file) :project-id (:project-id file)
:team-id (:team-id file)})))) :team-id (:team-id file)}}))))
(defn update-file!
"A public api that allows apply a transformation to a file with all context setup."
[{:keys [::db/conn] :as cfg} file-id update-fn & args]
(let [file (get-file cfg file-id)
file (apply update-file-data! cfg file update-fn args)]
(feat.fmigr/upsert-migrations! conn file)
(persist-file! cfg file)))
(def ^:private sql:get-file
"SELECT f.*, p.team_id
FROM file AS f
JOIN project AS p ON (p.id = f.project_id)
WHERE f.id = ?
AND (f.deleted_at IS NULL OR
f.deleted_at > now())
FOR KEY SHARE")
(defn get-file (defn get-file
"Get not-decoded file, only decodes the features set." "Get not-decoded file, only decodes the features set."
[conn id] [cfg id]
(let [file (db/exec-one! conn [sql:get-file id])] (bfc/get-file cfg id :decode? false :lock-for-share? true))
(when-not file
(ex/raise :type :not-found
:code :object-not-found
:hint (format "file with id '%s' does not exists" id)))
(update file :features db/decode-pgarray #{})))
(defn persist-file! (defn persist-file!
"Function responsible of persisting already encoded file. Should be "Function responsible of persisting already encoded file. Should be
used together with `get-file` and `update-file-data!`. used together with `get-file` and `update-file-data!`.
It also updates the project modified-at attr." It also updates the project modified-at attr."
[{:keys [::db/conn ::timestamp]} file] [{:keys [::db/conn ::timestamp] :as cfg} file]
(let [;; The timestamp can be nil because this function is also (let [;; The timestamp can be nil because this function is also
;; intended to be used outside of this module ;; intended to be used outside of this module
modified-at (or timestamp (ct/now))] modified-at
(or timestamp (ct/now))
file
(-> file
(dissoc ::snapshot)
(assoc :modified-at modified-at)
(assoc :has-media-trimmed false))]
(db/update! conn :project (db/update! conn :project
{:modified-at modified-at} {:modified-at modified-at}
{:id (:project-id file)} {:id (:project-id file)}
{::db/return-keys false}) {::db/return-keys false})
(db/update! conn :file (bfc/update-file! cfg file)))
{:revn (:revn file)
:data (:data file) (defn- invalidate-caches!
:version (:version file) [cfg {:keys [id] :as file}]
:features (:features file) (rds/run! cfg (fn [{:keys [::rds/conn]}]
:data-backend nil (let [key (str files/file-summary-cache-key-prefix id)]
:data-ref-id nil (rds/del conn key)))))
:modified-at modified-at
:has-media-trimmed false} (defn- attach-snapshot
{:id (:id file)} "Attach snapshot data to the file. This should be called before the
{::db/return-keys false}))) upcoming file operations are applied to the file."
[cfg migrated? file]
(let [snapshot (if migrated? file (fdata/realize cfg file))]
+    (assoc file ::snapshot snapshot)))

 (defn- update-file-data!
   "Perform a file data transformation in with all update context setup.
@@ -334,52 +329,35 @@
   fdata/pointer-map modified fragments."
   [cfg {:keys [id] :as file} update-fn & args]
-  (binding [pmap/*tracked* (pmap/create-tracked)
-            pmap/*load-fn* (partial feat.fdata/load-pointer cfg id)]
-    (let [file (update file :data (fn [data]
-                                    (-> data
-                                        (blob/decode)
-                                        (assoc :id (:id file)))))
-          libs (delay (bfc/get-resolved-file-libraries cfg file))
-
-          ;; For avoid unnecesary overhead of creating multiple pointers
-          ;; and handly internally with objects map in their worst
-          ;; case (when probably all shapes and all pointers will be
-          ;; readed in any case), we just realize/resolve them before
-          ;; applying the migration to the file
-          file (if (fmg/need-migration? file)
-                 (-> file
-                     (update :data feat.fdata/process-pointers deref)
-                     (update :data feat.fdata/process-objects (partial into {}))
-                     (fmg/migrate-file libs))
-                 file)
-
-          file (apply update-fn cfg file args)
-
-          ;; TODO: reuse operations if file is migrated
-          ;; TODO: move encoding to a separated thread
-          file (if (take-snapshot? file)
-                 (let [tpoint   (ct/tpoint)
-                       snapshot (-> (:data file)
-                                    (feat.fdata/process-pointers deref)
-                                    (feat.fdata/process-objects (partial into {}))
-                                    (blob/encode))
-                       elapsed  (tpoint)
-                       label    (str "internal/snapshot/" (:revn file))]
-                   (l/trc :hint "take snapshot"
-                          :file-id (str (:id file))
-                          :revn (:revn file)
-                          :label label
-                          :elapsed (ct/format-duration elapsed))
-                   (-> file
-                       (assoc ::snapshot-data snapshot)
-                       (assoc ::snapshot-label label)))
-                 file)]
-      (bfc/encode-file cfg file))))
+  (let [file (update file :data (fn [data]
+                                  (-> data
+                                      (blob/decode)
+                                      (assoc :id id))))
+        libs (delay (bfc/get-resolved-file-libraries cfg file))
+
+        need-migration?
+        (fmg/need-migration? file)
+
+        take-snapshot?
+        (take-snapshot? file)
+
+        ;; For avoid unnecesary overhead of creating multiple
+        ;; pointers and handly internally with objects map in their
+        ;; worst case (when probably all shapes and all pointers
+        ;; will be readed in any case), we just realize/resolve them
+        ;; before applying the migration to the file
+        file
+        (cond-> file
+          ;; need-migration?
+          ;; (->> (fdata/realize cfg))
+
+          need-migration?
+          (fmg/migrate-file libs)
+
+          take-snapshot?
+          (->> (attach-snapshot cfg need-migration?)))]
+
+    (apply update-fn cfg file args)))

 (defn- soft-validate-file-schema!
   [file]
@@ -468,8 +446,9 @@
 (defn- get-lagged-changes
   [conn {:keys [id revn] :as params}]
   (->> (db/exec! conn [sql:lagged-changes id revn])
-       (map files/decode-row)
-       (vec)))
+       (filter :changes)
+       (mapv (fn [row]
+               (update row :changes blob/decode)))))

 (defn- send-notifications!
   [cfg {:keys [team changes session-id] :as params} file]
View File
@@ -26,9 +26,7 @@
   [app.rpc.helpers :as rph]
   [app.rpc.quotes :as quotes]
   [app.storage :as sto]
-   [app.util.services :as sv]
-   [app.worker :as-alias wrk]
-   [promesa.exec :as px]))
+   [app.util.services :as sv]))

 (def valid-weight #{100 200 300 400 500 600 700 800 900 950})
 (def valid-style #{"normal" "italic"})
@@ -105,7 +103,7 @@
     (create-font-variant cfg (assoc params :profile-id profile-id)))))

 (defn create-font-variant
-  [{:keys [::sto/storage ::db/conn ::wrk/executor]} {:keys [data] :as params}]
+  [{:keys [::sto/storage ::db/conn]} {:keys [data] :as params}]
   (letfn [(generate-missing! [data]
             (let [data (media/run {:cmd :generate-fonts :input data})]
               (when (and (not (contains? data "font/otf"))
@@ -157,7 +155,7 @@
                 :otf-file-id (:id otf)
                 :ttf-file-id (:id ttf)}))]
-    (let [data   (px/invoke! executor (partial generate-missing! data))
+    (let [data   (generate-missing! data)
           assets (persist-fonts-files! data)
           result (insert-font-variant! assets)]
       (vary-meta result assoc ::audit/replace-props (update params :data (comp vec keys))))))
View File
@@ -38,7 +38,7 @@
   ::doc/added "1.15"
   ::doc/module :auth
   ::sm/params schema:login-with-ldap}
-  [{:keys [::setup/props ::ldap/provider] :as cfg} params]
+  [{:keys [::ldap/provider] :as cfg} params]
   (when-not provider
     (ex/raise :type :restriction
               :code :ldap-not-initialized
@@ -60,11 +60,11 @@
           ;; user comes from team-invitation process; in this case,
           ;; regenerate token and send back to the user a new invitation
           ;; token (and mark current session as logged).
-          (let [claims (tokens/verify props {:token token :iss :team-invitation})
+          (let [claims (tokens/verify cfg {:token token :iss :team-invitation})
                 claims (assoc claims
                               :member-id (:id profile)
                               :member-email (:email profile))
-                token  (tokens/generate props claims)]
+                token  (tokens/generate cfg claims)]
             (-> {:invitation-token token}
                 (rph/with-transform (session/create-fn cfg (:id profile)))
                 (rph/with-meta {::audit/props (:props profile)
View File
@@ -28,16 +28,14 @@
   [app.setup :as-alias setup]
   [app.setup.templates :as tmpl]
   [app.storage.tmp :as tmp]
-   [app.util.services :as sv]
-   [app.worker :as-alias wrk]
-   [promesa.exec :as px]))
+   [app.util.services :as sv]))

 ;; --- COMMAND: Duplicate File

 (defn duplicate-file
   [{:keys [::db/conn ::bfc/timestamp] :as cfg} {:keys [profile-id file-id name reset-shared-flag] :as params}]
   (let [;; We don't touch the original file on duplication
-        file       (bfc/get-file cfg file-id)
+        file       (bfc/get-file cfg file-id :realize? true)
         project-id (:project-id file)
         file       (-> file
                        (update :id bfc/lookup-index)
@@ -313,15 +311,14 @@
     ;; Update the modification date of the all affected projects
     ;; ensuring that the destination project is the most recent one.
-    (doseq [project-id (into (list project-id) source)]
-      ;; NOTE: as this is executed on virtual thread, sleeping does
-      ;; not causes major issues, and allows an easy way to set a
-      ;; trully different modification date to each file.
-      (px/sleep 10)
-      (db/update! conn :project
-                  {:modified-at (ct/now)}
-                  {:id project-id}))
+    (loop [project-ids (into (list project-id) source)
+           modified-at (ct/now)]
+      (when-let [project-id (first project-ids)]
+        (db/update! conn :project
+                    {:modified-at modified-at}
+                    {:id project-id})
+        (recur (rest project-ids)
+               (ct/plus modified-at 10))))

     nil))
@@ -396,12 +393,7 @@
 ;; --- COMMAND: Clone Template

 (defn clone-template
-  [{:keys [::db/pool ::wrk/executor] :as cfg} {:keys [project-id profile-id] :as params} template]
-  ;; NOTE: the importation process performs some operations
-  ;; that are not very friendly with virtual threads, and for
-  ;; avoid unexpected blocking of other concurrent operations
-  ;; we dispatch that operation to a dedicated executor.
+  [{:keys [::db/pool] :as cfg} {:keys [project-id profile-id] :as params} template]
   (let [template (tmp/tempfile-from template
                                     :prefix "penpot.template."
                                     :suffix ""
@@ -419,8 +411,8 @@
                      (assoc ::bfc/features (cfeat/get-team-enabled-features cf/flags team)))
         result   (if (= format :binfile-v3)
-                   (px/invoke! executor (partial bf.v3/import-files! cfg))
-                   (px/invoke! executor (partial bf.v1/import-files! cfg)))]
+                   (bf.v3/import-files! cfg)
+                   (bf.v1/import-files! cfg))]
     (db/tx-run! cfg
                 (fn [{:keys [::db/conn] :as cfg}]
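The hunk above replaces the `px/sleep`-per-iteration trick with a loop that advances the `modified-at` value itself, so each affected project still receives a distinct, strictly ordered timestamp without blocking. A minimal sketch of that idea in plain Java (the 10 ms step and the starting instant are assumptions read from `(ct/plus modified-at 10)` and `(ct/now)`; the class and method names are illustrative):

```java
import java.time.Instant;
import java.util.ArrayList;
import java.util.List;

public class DistinctTimestamps {
    // Produce one strictly increasing timestamp per item, stepping the
    // value instead of sleeping between database updates.
    static List<Instant> timestampsFor(int count, Instant start) {
        List<Instant> out = new ArrayList<>();
        Instant t = start;
        for (int i = 0; i < count; i++) {
            out.add(t);
            t = t.plusMillis(10); // assumed 10 ms step, as in (ct/plus modified-at 10)
        }
        return out;
    }

    public static void main(String[] args) {
        Instant start = Instant.parse("2025-01-01T00:00:00Z");
        List<Instant> ts = timestampsFor(3, start);
        System.out.println(ts.get(2)); // 2025-01-01T00:00:00.020Z
    }
}
```

The design choice is the same in both versions: later projects get later timestamps, so the destination project ends up the most recently modified; only the mechanism (arithmetic vs. wall-clock sleeps) changes.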
View File
@@ -24,10 +24,8 @@
   [app.storage :as sto]
   [app.storage.tmp :as tmp]
   [app.util.services :as sv]
-   [app.worker :as-alias wrk]
   [cuerdas.core :as str]
-   [datoteka.io :as io]
-   [promesa.exec :as px]))
+   [datoteka.io :as io]))

 (def default-max-file-size
   (* 1024 1024 10)) ; 10 MiB
@@ -153,9 +151,9 @@
           (assoc ::image (process-main-image info)))))

 (defn- create-file-media-object
-  [{:keys [::sto/storage ::db/conn ::wrk/executor] :as cfg}
+  [{:keys [::sto/storage ::db/conn] :as cfg}
   {:keys [id file-id is-local name content]}]
-  (let [result (px/invoke! executor (partial process-image content))
+  (let [result (process-image content)
         image  (sto/put-object! storage (::image result))
         thumb  (when-let [params (::thumb result)]
                  (sto/put-object! storage params))]
View File
@@ -30,16 +30,13 @@
   [app.tokens :as tokens]
   [app.util.services :as sv]
   [app.worker :as wrk]
-   [cuerdas.core :as str]
-   [promesa.exec :as px]))
+   [cuerdas.core :as str]))

 (declare check-profile-existence!)
 (declare decode-row)
-(declare derive-password)
 (declare filter-props)
 (declare get-profile)
 (declare strip-private-attrs)
-(declare verify-password)

 (def schema:props-notifications
   [:map {:title "props-notifications"}
@@ -110,7 +107,9 @@
 (defn get-profile
   "Get profile by id. Throws not-found exception if no profile found."
   [conn id & {:as opts}]
-  (-> (db/get-by-id conn :profile id opts)
+  ;; NOTE: We need to set ::db/remove-deleted to false because demo profiles
+  ;; are created with a set deleted-at value
+  (-> (db/get-by-id conn :profile id (assoc opts ::db/remove-deleted false))
      (decode-row)))

 ;; --- MUTATION: Update Profile (own)
@@ -192,7 +191,7 @@
  [{:keys [::db/conn] :as cfg} {:keys [profile-id old-password] :as params}]
  (let [profile (db/get-by-id conn :profile profile-id ::sql/for-update true)]
    (when (and (not= (:password profile) "!")
-               (not (:valid (verify-password cfg old-password (:password profile)))))
+               (not (:valid (auth/verify-password old-password (:password profile)))))
      (ex/raise :type :validation
                :code :old-password-not-match))
    profile))
@@ -201,7 +200,7 @@
  [{:keys [::db/conn] :as cfg} {:keys [id password] :as profile}]
  (when-not (db/read-only? conn)
    (db/update! conn :profile
-                {:password (derive-password cfg password)}
+                {:password (auth/derive-password password)}
                {:id id})
    nil))
@@ -303,12 +302,11 @@
      :content-type (:mtype thumb)}))

 (defn upload-photo
-  [{:keys [::sto/storage ::wrk/executor] :as cfg} {:keys [file] :as params}]
+  [{:keys [::sto/storage] :as cfg} {:keys [file] :as params}]
  (let [params (-> cfg
                   (assoc ::climit/id [[:process-image/by-profile (:profile-id params)]
                                       [:process-image/global]])
                   (assoc ::climit/label "upload-photo")
-                   (assoc ::climit/executor executor)
                   (climit/invoke! generate-thumbnail! file))]
    (sto/put-object! storage params)))
@@ -349,12 +347,12 @@
 (defn- request-email-change!
  [{:keys [::db/conn] :as cfg} {:keys [profile email] :as params}]
-  (let [token  (tokens/generate (::setup/props cfg)
+  (let [token  (tokens/generate cfg
                                {:iss :change-email
                                 :exp (ct/in-future "15m")
                                 :profile-id (:id profile)
                                 :email email})
-        ptoken (tokens/generate (::setup/props cfg)
+        ptoken (tokens/generate cfg
                                {:iss :profile-identity
                                 :profile-id (:id profile)
                                 :exp (ct/in-future {:days 30})})]
@@ -477,13 +475,16 @@
          p.fullname AS name,
          p.email AS email
     FROM team_profile_rel AS tpr1
+    JOIN team as t
+      ON tpr1.team_id = t.id
     JOIN team_profile_rel AS tpr2
      ON (tpr1.team_id = tpr2.team_id)
     JOIN profile AS p
      ON (tpr2.profile_id = p.id)
    WHERE tpr1.profile_id = ?
      AND tpr1.is_owner IS true
-     AND tpr2.can_edit IS true")
+     AND tpr2.can_edit IS true
+     AND t.deleted_at IS NULL")

 (sv/defmethod ::get-subscription-usage
  {::doc/added "2.9"}
@@ -548,15 +549,6 @@
  [props]
  (into {} (filter (fn [[k _]] (simple-ident? k))) props))

-(defn derive-password
-  [{:keys [::wrk/executor]} password]
-  (when password
-    (px/invoke! executor (partial auth/derive-password password))))
-
-(defn verify-password
-  [{:keys [::wrk/executor]} password password-data]
-  (px/invoke! executor (partial auth/verify-password password password-data)))
-
 (defn decode-row
  [{:keys [props] :as row}]
  (cond-> row
View File
@@ -37,14 +37,14 @@
 ;; --- Helpers & Specs

 (def ^:private sql:team-permissions
-  "select tpr.is_owner,
-          tpr.is_admin,
-          tpr.can_edit
-     from team_profile_rel as tpr
-     join team as t on (t.id = tpr.team_id)
-    where tpr.profile_id = ?
-      and tpr.team_id = ?
-      and t.deleted_at is null")
+  "SELECT tpr.is_owner,
+          tpr.is_admin,
+          tpr.can_edit
+     FROM team_profile_rel AS tpr
+     JOIN team AS t ON (t.id = tpr.team_id)
+    WHERE tpr.profile_id = ?
+      AND tpr.team_id = ?
+      AND t.deleted_at IS NULL")

 (defn get-permissions
  [conn profile-id team-id]
@@ -443,13 +443,18 @@
   [:team-id ::sm/uuid]])

 (def sql:team-invitations
-  "select email_to as email, role, (valid_until < now()) as expired
-     from team_invitation where team_id = ? order by valid_until desc, created_at desc")
+  "SELECT email_to AS email,
+          role,
+          (valid_until < ?::timestamptz) AS expired
+     FROM team_invitation
+    WHERE team_id = ?
+    ORDER BY valid_until DESC, created_at DESC")

 (defn get-team-invitations
  [conn team-id]
-  (->> (db/exec! conn [sql:team-invitations team-id])
-       (mapv #(update % :role keyword))))
+  (let [now (ct/now)]
+    (->> (db/exec! conn [sql:team-invitations now team-id])
+         (mapv #(update % :role keyword)))))

 (sv/defmethod ::get-team-invitations
  {::doc/added "1.17"
View File
@@ -6,6 +6,7 @@
 (ns app.rpc.commands.teams-invitations
  (:require
+   [app.binfile.common :as bfc]
   [app.common.data :as d]
   [app.common.data.macros :as dm]
   [app.common.exceptions :as ex]
@@ -21,7 +22,6 @@
   [app.loggers.audit :as audit]
   [app.main :as-alias main]
   [app.rpc :as-alias rpc]
-   [app.rpc.commands.files :as files]
   [app.rpc.commands.profile :as profile]
   [app.rpc.commands.teams :as teams]
   [app.rpc.doc :as-alias doc]
@@ -43,7 +43,7 @@
 (defn- create-invitation-token
  [cfg {:keys [profile-id valid-until team-id member-id member-email role]}]
-  (tokens/generate (::setup/props cfg)
+  (tokens/generate cfg
                   {:iss :team-invitation
                    :exp valid-until
                    :profile-id profile-id
@@ -54,12 +54,8 @@
 (defn- create-profile-identity-token
  [cfg profile-id]
-  (dm/assert!
-   "expected valid uuid for profile-id"
-   (uuid? profile-id))
-
-  (tokens/generate (::setup/props cfg)
+  (assert (uuid? profile-id) "expected valid uuid for profile-id")
+  (tokens/generate cfg
                   {:iss :profile-identity
                    :profile-id profile-id
                    :exp (ct/in-future {:days 30})}))
@@ -224,8 +220,25 @@
 (def ^:private xf:map-email (map :email))

 (defn- create-team-invitations
-  [{:keys [::db/conn] :as cfg} {:keys [profile team role emails] :as params}]
-  (let [emails (set emails)
+  "Unified function to handle both create and resend team invitations.
+  Accepts either:
+  - emails (set) + role (single role for all emails)
+  - invitations (vector of {:email :role} maps)"
+  [{:keys [::db/conn] :as cfg} {:keys [profile team role emails invitations] :as params}]
+  (let [;; Normalize input to a consistent format: [{:email :role}]
+        invitation-data (cond
+                          ;; Case 1: emails + single role (create invitations style)
+                          (and emails role)
+                          (map (fn [email] {:email email :role role}) emails)
+
+                          ;; Case 2: invitations with individual roles (resend invitations style)
+                          (some? invitations)
+                          invitations
+
+                          :else
+                          (throw (ex-info "Invalid parameters: must provide either emails+role or invitations" {})))
+
+        invitation-emails (into #{} (map :email) invitation-data)

        join-requests (->> (get-valid-access-request-profiles conn (:id team))
                           (d/index-by :email))
@@ -235,51 +248,84 @@
        invitations (into #{}
                          (comp
-                           ;; We don't re-send inviation to
+                           ;; We don't re-send invitations to
                           ;; already existing members
-                           (remove team-members)
+                           (remove #(contains? team-members (:email %)))
                           ;; We don't send invitations to
                           ;; join-requested members
-                           (remove join-requests)
-                           (map (fn [email] (assoc params :email email)))
-                           (keep (partial create-invitation cfg)))
-                         emails)]
+                           (remove #(contains? join-requests (:email %)))
+                           (map (fn [{:keys [email role]}]
+                                  (create-invitation cfg
+                                                     (-> params
+                                                         (assoc :email email)
+                                                         (assoc :role role)))))
+                           (remove nil?))
+                         invitation-data)]

    ;; For requested invitations, do not send invitation emails, add
    ;; the user directly to the team
    (->> join-requests
-         (filter #(contains? emails (key %)))
-         (map val)
-         (run! (partial add-member-to-team conn profile team role)))
+         (filter #(contains? invitation-emails (key %)))
+         (map (fn [[email member]]
+                (let [role (:role (first (filter #(= (:email %) email) invitation-data)))]
+                  (add-member-to-team conn profile team role member))))
+         (doall))

    invitations))

 (def ^:private schema:create-team-invitations
+  [:and
   [:map {:title "create-team-invitations"}
    [:team-id ::sm/uuid]
-    [:role types.team/schema:role]
-    [:emails [::sm/set ::sm/email]]])
+    ;; Support both formats:
+    ;; 1. emails (set) + role (single role for all)
+    ;; 2. invitations (vector of {:email :role} maps)
+    [:emails {:optional true} [::sm/set ::sm/email]]
+    [:role {:optional true} types.team/schema:role]
+    [:invitations {:optional true} [:vector [:map
+                                             [:email ::sm/email]
+                                             [:role types.team/schema:role]]]]]
+   ;; Ensure exactly one format is provided
+   [:fn (fn [params]
+          (let [has-emails-role (and (contains? params :emails)
+                                     (contains? params :role))
+                has-invitations (contains? params :invitations)]
+            (and (or has-emails-role has-invitations)
+                 (not (and has-emails-role has-invitations)))))]])

 (def ^:private max-invitations-by-request-threshold
  "The number of invitations can be sent in a single rpc request"
  25)

 (sv/defmethod ::create-team-invitations
-  "A rpc call that allow to send a single or multiple invitations to
-  join the team."
+  "A rpc call that allows to send single or multiple invitations to join the team.
+
+  Supports two parameter formats:
+  1. emails (set) + role (single role for all emails)
+  2. invitations (vector of {:email :role} maps for individual roles)"
  {::doc/added "1.17"
   ::doc/module :teams
   ::sm/params schema:create-team-invitations}
-  [cfg {:keys [::rpc/profile-id team-id emails] :as params}]
+  [cfg {:keys [::rpc/profile-id team-id role emails] :as params}]
  (let [perms   (teams/get-permissions cfg profile-id team-id)
        profile (db/get-by-id cfg :profile profile-id)
-        emails  (into #{} (map profile/clean-email) emails)]
+
+        ;; Determine which format is being used
+        using-emails-format? (and emails role)
+
+        ;; Handle both parameter formats
+        emails (if using-emails-format?
+                 (into #{} (map profile/clean-email) emails)
+                 #{})
+
+        ;; Calculate total invitation count for both formats
+        invitation-count (if using-emails-format?
+                           (count emails)
+                           (count (:invitations params)))]

    (when-not (:is-admin perms)
      (ex/raise :type :validation
                :code :insufficient-permissions))

-    (when (> (count emails) max-invitations-by-request-threshold)
+    (when (> invitation-count max-invitations-by-request-threshold)
      (ex/raise :type :validation
                :code :max-invitations-by-request
                :hint "the maximum of invitation on single request is reached"
@@ -288,7 +334,7 @@
    (-> cfg
        (assoc ::quotes/profile-id profile-id)
        (assoc ::quotes/team-id team-id)
-        (assoc ::quotes/incr (count emails))
+        (assoc ::quotes/incr invitation-count)
        (quotes/check! {::quotes/id ::quotes/invitations-per-team}
                       {::quotes/id ::quotes/profiles-per-team}))
@@ -304,7 +350,12 @@
                  (-> params
                      (assoc :profile profile)
                      (assoc :team team)
-                      (assoc :emails emails)))]
+                      ;; Pass parameters in the correct format for the unified function
+                      (cond-> using-emails-format?
+                        ;; If using emails+role format, ensure both are present
+                        (assoc :emails emails :role role)
+                        ;; If using invitations format, the :invitations key is already in params
+                        (not using-emails-format?) identity)))]
    (with-meta {:total (count invitations)
               :invitations invitations}
@@ -467,7 +518,7 @@
 (defn- check-existing-team-access-request
  "Checks if an existing team access request is still valid"
-  [conn team-id profile-id]
+  [{:keys [::db/conn]} team-id profile-id]
  (when-let [request (db/get* conn :team-access-request
                              {:team-id team-id
                               :requester-id profile-id})]
@@ -485,8 +536,8 @@
 (defn- upsert-team-access-request
  "Create or update team access request for provided team and profile-id"
-  [conn team-id requester-id]
-  (check-existing-team-access-request conn team-id requester-id)
+  [{:keys [::db/conn] :as cfg} team-id requester-id]
+  (check-existing-team-access-request cfg team-id requester-id)
  (let [valid-until     (ct/in-future {:hours 24})
        auto-join-until (ct/in-future {:days 7})
        request-id      (uuid/next)]
@@ -499,7 +550,7 @@
  "A specific method for obtain a file with name and page-id used for
  team request access procediment"
  [cfg file-id]
-  (let [file (files/get-file cfg file-id :migrate? false)]
+  (let [file (bfc/get-file cfg file-id :migrate? false)]
    (-> file
        (dissoc :data)
        (dissoc :deleted-at)
@@ -548,7 +599,7 @@
    (teams/check-email-bounce conn (:email team-owner) false)
    (teams/check-email-spam conn (:email team-owner) true)

-    (let [request (upsert-team-access-request conn team-id profile-id)
+    (let [request (upsert-team-access-request cfg team-id profile-id)
          factory (cond
                    (and (some? file) (:is-default team) is-viewer)
                    eml/request-file-access-yourpenpot-view
View File
@@ -38,7 +38,7 @@
  ::doc/module :auth
  ::sm/params schema:verify-token}
 [cfg {:keys [token] :as params}]
- (let [claims (tokens/verify (::setup/props cfg) {:token token})]
+ (let [claims (tokens/verify cfg {:token token})]
   (db/tx-run! cfg process-token params claims)))

 (defmethod process-token :change-email
View File
@@ -51,7 +51,7 @@
 (defn- get-view-only-bundle
  [{:keys [::db/conn] :as cfg} {:keys [profile-id file-id ::perms] :as params}]
-  (let [file    (files/get-file cfg file-id)
+  (let [file    (bfc/get-file cfg file-id)
        project (db/get conn :project
                        {:id (:project-id file)}
@@ -81,7 +81,7 @@
        libs  (->> (bfc/get-file-libraries conn file-id)
                   (mapv (fn [{:keys [id] :as lib}]
-                           (merge lib (files/get-file cfg id)))))
+                           (merge lib (bfc/get-file cfg id)))))
        links (->> (db/query conn :share-link {:file-id file-id})
                   (mapv (fn [row]
View File
@@ -66,13 +66,6 @@
   [integrant.core :as ig]
   [promesa.exec :as px]))

-(def ^:private default-timeout
-  (ct/duration 400))
-
-(def ^:private default-options
-  {:codec rds/string-codec
-   :timeout default-timeout})
-
 (def ^:private bucket-rate-limit-script
  {::rscript/name ::bucket-rate-limit
   ::rscript/path "app/rpc/rlimit/bucket.lua"})
@@ -177,11 +170,11 @@
              :hint (str/ffmt "looks like '%' does not have a valid format" opts))))

 (defmethod process-limit :bucket
-  [redis user-id now {:keys [::key ::params ::service ::capacity ::interval ::rate] :as limit}]
+  [rconn user-id now {:keys [::key ::params ::service ::capacity ::interval ::rate] :as limit}]
  (let [script    (-> bucket-rate-limit-script
                      (assoc ::rscript/keys [(str key "." service "." user-id)])
                      (assoc ::rscript/vals (conj params (->seconds now))))
-        result    (rds/eval redis script)
+        result    (rds/eval rconn script)
        allowed?  (boolean (nth result 0))
        remaining (nth result 1)
        reset     (* (/ (inst-ms interval) rate)
@@ -199,13 +192,13 @@
        (assoc ::lresult/remaining remaining))))

 (defmethod process-limit :window
-  [redis user-id now {:keys [::nreq ::unit ::key ::service] :as limit}]
+  [rconn user-id now {:keys [::nreq ::unit ::key ::service] :as limit}]
  (let [ts     (ct/truncate now unit)
        ttl    (ct/diff now (ct/plus ts {unit 1}))
        script (-> window-rate-limit-script
                   (assoc ::rscript/keys [(str key "." service "." user-id "." (ct/format-inst ts))])
                   (assoc ::rscript/vals [nreq (->seconds ttl)]))
-        result (rds/eval redis script)
+        result (rds/eval rconn script)
        allowed?  (boolean (nth result 0))
        remaining (nth result 1)]
    (l/trace :hint "limit processed"
@@ -220,9 +213,9 @@
        (assoc ::lresult/remaining remaining)
        (assoc ::lresult/reset (ct/plus ts {unit 1})))))

-(defn- process-limits!
-  [redis user-id limits now]
-  (let [results   (into [] (map (partial process-limit redis user-id now)) limits)
+(defn- process-limits
+  [rconn user-id limits now]
+  (let [results   (into [] (map (partial process-limit rconn user-id now)) limits)
        remaining (->> results
                       (d/index-by ::name ::lresult/remaining)
                       (uri/map->query-string))
@@ -259,34 +252,25 @@
        (some-> request inet/parse-request)
        uuid/zero)))

-(defn process-request!
-  [{:keys [::rpc/rlimit ::rds/redis ::skey ::sname] :as cfg} params]
-  (when-let [limits (get-limits rlimit skey sname)]
-    (let [redis  (rds/get-or-connect redis ::rpc/rlimit default-options)
-          uid    (get-uid params)
-
-          ;; FIXME: why not clasic try/catch?
-          result (ex/try! (process-limits! redis uid limits (ct/now)))]
-
-      (l/trc :hint "process-limits"
-             :service sname
-             :remaining (::remaingin result)
-             :reset (::reset result))
-
-      (cond
-        (ex/exception? result)
-        (do
-          (l/error :hint "error on processing rate-limit" :cause result)
-          {::enabled false})
-
-        (contains? cf/flags :soft-rpc-rlimit)
-        {::enabled false}
-
-        :else
-        result))))
+(defn- process-request'
+  [{:keys [::rds/conn] :as cfg} limits params]
+  (try
+    (let [uid    (get-uid params)
+          result (process-limits conn uid limits (ct/now))]
+      (if (contains? cf/flags :soft-rpc-rlimit)
+        {::enabled false}
+        result))
+    (catch Throwable cause
+      (l/error :hint "error on processing rate-limit" :cause cause)
+      {::enabled false})))
+
+(defn- process-request
+  [{:keys [::rpc/rlimit ::skey ::sname] :as cfg} params]
+  (when-let [limits (get-limits rlimit skey sname)]
+    (rds/run! cfg process-request' limits params)))

 (defn wrap
-  [{:keys [::rpc/rlimit ::rds/redis] :as cfg} f mdata]
-  (assert (rds/redis? redis) "expected a valid redis instance")
+  [{:keys [::rpc/rlimit] :as cfg} f mdata]
  (assert (or (nil? rlimit) (valid-rlimit-instance? rlimit)) "expected a valid rlimit instance")

  (if rlimit
@@ -298,7 +282,7 @@
    (fn [hcfg params]
      (if @enabled
-        (let [result (process-request! cfg params)]
+        (let [result (process-request cfg params)]
          (if (::enabled result)
            (if (::allowed result)
              (-> (f hcfg params)
@@ -399,7 +383,7 @@
  (when-let [path (cf/get :rpc-rlimit-config)]
    (and (fs/exists? path) (fs/regular-file? path) path)))

-(defmethod ig/assert-key :app.rpc/rlimit
+(defmethod ig/assert-key ::rpc/rlimit
  [_ {:keys [::wrk/executor]}]
  (assert (sm/valid? ::wrk/executor executor) "expect valid executor"))
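For context on the `:window` strategy above: requests are counted against a key built from service, user id, and the truncated timestamp, and the count is enforced atomically by a Redis Lua script. The sketch below is only an in-memory analogy of that fixed-window idea; the class and method names are invented for illustration and are not part of Penpot's API:

```java
import java.util.HashMap;
import java.util.Map;

public class FixedWindowLimiter {
    // At most nreq requests per window key. The real implementation keeps
    // this counter in Redis (with a TTL) so it is shared and atomic across
    // backend instances; a HashMap only illustrates the accounting.
    private final int nreq;
    private final Map<String, Integer> counters = new HashMap<>();

    FixedWindowLimiter(int nreq) {
        this.nreq = nreq;
    }

    // The window key would be something like "<key>.<service>.<user-id>.<window-start>".
    boolean allow(String windowKey) {
        int used = counters.merge(windowKey, 1, Integer::sum);
        return used <= nreq;
    }

    public static void main(String[] args) {
        FixedWindowLimiter limiter = new FixedWindowLimiter(2);
        System.out.println(limiter.allow("default.user-1.2025-01-01T00")); // true
        System.out.println(limiter.allow("default.user-1.2025-01-01T00")); // true
        System.out.println(limiter.allow("default.user-1.2025-01-01T00")); // false
    }
}
```

Because the window start is part of the key, counters reset naturally when a new window begins; in Redis the old key simply expires via its TTL (`->seconds ttl` in the diff).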
View File
@@ -0,0 +1,48 @@
;; This Source Code Form is subject to the terms of the Mozilla Public
;; License, v. 2.0. If a copy of the MPL was not distributed with this
;; file, You can obtain one at http://mozilla.org/MPL/2.0/.
;;
;; Copyright (c) KALEIDOS INC
(ns app.setup.clock
"A service/module that manages the system clock and allows runtime
modification of time offset (useful for testing and time adjustments)."
(:require
[app.common.logging :as l]
[app.common.time :as ct]
[app.setup :as-alias setup]
[integrant.core :as ig])
(:import
java.time.Clock
java.time.Duration))
(defonce current
(atom {:clock (Clock/systemDefaultZone)
:offset nil}))
(defmethod ig/init-key ::setup/clock
[_ _]
(add-watch current ::common
(fn [_ _ _ {:keys [clock offset]}]
(let [clock (if (ct/duration? offset)
(Clock/offset ^Clock clock
^Duration offset)
clock)]
(l/wrn :hint "altering clock" :clock (str clock))
(alter-var-root #'ct/*clock* (constantly clock))))))
(defmethod ig/halt-key! ::setup/clock
[_ _]
(remove-watch current ::common))
(defn set-offset!
[duration]
(swap! current assoc :offset (some-> duration ct/duration)))
(defn set-clock!
([]
(swap! current assoc :clock (Clock/systemDefaultZone)))
([clock]
(when (instance? Clock clock)
(swap! current assoc :clock clock))))
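The clock service above can be driven from a REPL. A minimal sketch, assuming a running system where the `::setup/clock` key has been initialized and that `app.common.time/duration` accepts a map (both are assumptions, not part of this diff):

```clojure
;; Sketch only: aliases are local assumptions.
(require '[app.setup.clock :as clock]
         '[app.common.time :as ct])

;; Shift the effective time two hours forward; the watch registered on
;; `current` rebuilds ct/*clock* with the offset applied.
(clock/set-offset! (ct/duration {:hours 2}))

;; Drop any custom clock and return to the system default zone clock
;; (a still-set offset keeps being applied by the watch).
(clock/set-clock!)
```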
@@ -7,7 +7,7 @@
(ns app.srepl.cli (ns app.srepl.cli
"PREPL API for external usage (CLI or ADMIN)" "PREPL API for external usage (CLI or ADMIN)"
(:require (:require
[app.auth :as auth] [app.auth :refer [derive-password]]
[app.common.exceptions :as ex] [app.common.exceptions :as ex]
[app.common.schema :as sm] [app.common.schema :as sm]
[app.common.schema.generators :as sg] [app.common.schema.generators :as sg]
@@ -54,7 +54,7 @@
(some-> (get-current-system) (some-> (get-current-system)
(db/tx-run! (db/tx-run!
(fn [{:keys [::db/conn] :as system}] (fn [{:keys [::db/conn] :as system}]
(let [password (cmd.profile/derive-password system password) (let [password (derive-password password)
params {:id (uuid/next) params {:id (uuid/next)
:email email :email email
:fullname fullname :fullname fullname
@@ -74,7 +74,7 @@
(assoc :fullname fullname) (assoc :fullname fullname)
(some? password) (some? password)
(assoc :password (auth/derive-password password)) (assoc :password (derive-password password))
(some? is-active) (some? is-active)
(assoc :is-active is-active))] (assoc :is-active is-active))]
@@ -124,13 +124,12 @@
(defmethod exec-command "derive-password" (defmethod exec-command "derive-password"
[{:keys [password]}] [{:keys [password]}]
(auth/derive-password password)) (derive-password password))
(defmethod exec-command "authenticate" (defmethod exec-command "authenticate"
[{:keys [token]}] [{:keys [token]}]
(when-let [system (get-current-system)] (when-let [system (get-current-system)]
(let [props (get system ::setup/props)] (tokens/verify system {:token token :iss "authentication"})))
(tokens/verify props {:token token :iss "authentication"}))))
(def ^:private schema:get-customer (def ^:private schema:get-customer
[:map [:id ::sm/uuid]]) [:map [:id ::sm/uuid]])
@@ -1,278 +0,0 @@
;; This Source Code Form is subject to the terms of the Mozilla Public
;; License, v. 2.0. If a copy of the MPL was not distributed with this
;; file, You can obtain one at http://mozilla.org/MPL/2.0/.
;;
;; Copyright (c) KALEIDOS INC
(ns app.srepl.fixes
"A misc of fix functions"
(:refer-clojure :exclude [parse-uuid])
(:require
[app.binfile.common :as bfc]
[app.common.data :as d]
[app.common.data.macros :as dm]
[app.common.files.changes :as cpc]
[app.common.files.helpers :as cfh]
[app.common.files.repair :as cfr]
[app.common.files.validate :as cfv]
[app.common.logging :as l]
[app.common.types.component :as ctk]
[app.common.types.container :as ctn]
[app.common.types.file :as ctf]
[app.common.uuid :as uuid]
[app.db :as db]
[app.features.fdata :as feat.fdata]
[app.srepl.helpers :as h]))
(defn disable-fdata-features
[{:keys [id features] :as file} _]
(when (or (contains? features "fdata/pointer-map")
(contains? features "fdata/objects-map"))
(l/warn :hint "disable fdata features" :file-id (str id))
(-> file
(update :data feat.fdata/process-pointers deref)
(update :data feat.fdata/process-objects (partial into {}))
(update :features disj "fdata/pointer-map" "fdata/objects-map"))))
(def sql:get-fdata-files
"SELECT id FROM file
WHERE deleted_at is NULL
AND (features @> '{fdata/pointer-map}' OR
features @> '{fdata/objects-map}')
ORDER BY created_at DESC")
(defn find-fdata-pointers
[{:keys [id features data] :as file} _]
(when (contains? features "fdata/pointer-map")
(let [pointers (feat.fdata/get-used-pointer-ids data)]
(l/warn :hint "found pointers" :file-id (str id) :pointers pointers)
nil)))
(defn repair-file-media
"A helper intended to be used with `srepl.main/process-files!` that
fixes all not propertly referenced file-media-object for a file"
[{:keys [id data] :as file} & _]
(let [conn (db/get-connection h/*system*)
used (cfh/collect-used-media data)
ids (db/create-array conn "uuid" used)
sql "SELECT * FROM file_media_object WHERE id = ANY(?)"
rows (db/exec! conn [sql ids])
index (reduce (fn [index media]
(if (not= (:file-id media) id)
(let [media-id (uuid/next)]
(l/wrn :hint "found not referenced media"
:file-id (str id)
:media-id (str (:id media)))
(db/insert! conn :file-media-object
(-> media
(assoc :file-id id)
(assoc :id media-id)))
(assoc index (:id media) media-id))
index))
{}
rows)]
(when (seq index)
(binding [bfc/*state* (atom {:index index})]
(update file :data (fn [fdata]
(-> fdata
(update :pages-index #'bfc/relink-shapes)
(update :components #'bfc/relink-shapes)
(update :media #'bfc/relink-media)
(d/without-nils))))))))
(defn repair-file
"Internal helper for validate and repair the file. The operation is
applied multiple times untile file is fixed or max iteration counter
is reached (default 10)"
[file libs & {:keys [max-iterations] :or {max-iterations 10}}]
(let [validate-and-repair
(fn [file libs iteration]
(when-let [errors (not-empty (cfv/validate-file file libs))]
(l/trc :hint "repairing file"
:file-id (str (:id file))
:iteration iteration
:errors (count errors))
(let [changes (cfr/repair-file file libs errors)]
(-> file
(update :revn inc)
(update :data cpc/process-changes changes)))))
process-file
(fn [file libs]
(loop [file file
iteration 0]
(if (< iteration max-iterations)
(if-let [file (validate-and-repair file libs iteration)]
(recur file (inc iteration))
file)
(do
(l/wrn :hint "max retry num reached on repairing file"
:file-id (str (:id file))
:iteration iteration)
file))))
file'
(process-file file libs)]
(when (not= (:revn file) (:revn file'))
(l/trc :hint "file repaired" :file-id (str (:id file))))
file'))
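A hedged usage sketch of the `repair-file` helper above; the file lookup and library resolution come from helpers required by this namespace, and binding them this way is an assumption for illustration:

```clojure
;; Sketch: load a file plus its resolved libraries, then run the
;; validate-and-repair loop with a tighter cap than the default of 10.
(let [file (h/get-file h/*system* file-id)
      libs (bfc/get-resolved-file-libraries h/*system* file)]
  (repair-file file libs :max-iterations 5))
```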
(defn fix-touched-shapes-group
[file _]
;; Remove :shapes-group from the touched elements
(letfn [(fix-fdata [data]
(-> data
(update :pages-index update-vals fix-container)))
(fix-container [container]
(d/update-when container :objects update-vals fix-shape))
(fix-shape [shape]
(d/update-when shape :touched
(fn [touched]
(disj touched :shapes-group))))]
file (-> file
(update :data fix-fdata))))
(defn add-swap-slots
[file libs _opts]
;; Detect swapped copies and try to generate a valid swap-slot.
(letfn [(process-fdata [data]
;; Walk through all containers in the file, both pages and deleted components.
(reduce process-container data (ctf/object-containers-seq data)))
(process-container [data container]
;; Walk through all shapes in depth-first tree order.
(l/dbg :hint "Processing container" :type (:type container) :name (:name container))
(let [root-shape (ctn/get-container-root container)]
(ctf/update-container data
container
#(reduce process-shape % (ctn/get-direct-children container root-shape)))))
(process-shape [container shape]
;; Look for head copies in the first level (either component roots or inside main components).
;; Even if they have been swapped, we don't add slot to them because there is no way to know
;; the original shape. Only children.
(if (and (ctk/instance-head? shape)
(ctk/in-component-copy? shape)
(nil? (ctk/get-swap-slot shape)))
(process-copy-head container shape)
(reduce process-shape container (ctn/get-direct-children container shape))))
(process-copy-head [container head-shape]
;; Process recursively all children, comparing each one with the corresponding child in the main
;; component, looking by position. If the shape-ref does not point to the found child, then it has
;; been swapped and need to set up a slot.
(l/trc :hint "Processing copy-head" :id (:id head-shape) :name (:name head-shape))
(let [component-shape (ctf/find-ref-shape file container libs head-shape :include-deleted? true :with-context? true)
component-container (:container (meta component-shape))]
(loop [container container
children (map #(ctn/get-shape container %) (:shapes head-shape))
component-children (map #(ctn/get-shape component-container %) (:shapes component-shape))]
(let [child (first children)
component-child (first component-children)]
(if (or (nil? child) (nil? component-child))
container
(let [container (if (and (not (ctk/is-main-of? component-child child))
(nil? (ctk/get-swap-slot child))
(ctk/instance-head? child))
(let [slot (guess-swap-slot component-child component-container)]
(l/dbg :hint "child" :id (:id child) :name (:name child) :slot slot)
(ctn/update-shape container (:id child) #(ctk/set-swap-slot % slot)))
container)]
(recur (process-copy-head container child)
(rest children)
(rest component-children))))))))
(guess-swap-slot [shape container]
;; To guess the slot, we must follow the chain until we find the definitive main. But
;; we cannot navigate by shape-ref, because main shapes may also have been swapped. So
;; chain by position, too.
(if-let [slot (ctk/get-swap-slot shape)]
slot
(if-not (ctk/in-component-copy? shape)
(:id shape)
(let [head-copy (ctn/get-component-shape (:objects container) shape)]
(if (= (:id head-copy) (:id shape))
(:id shape)
(let [head-main (ctf/find-ref-shape file
container
libs
head-copy
:include-deleted? true
:with-context? true)
container-main (:container (meta head-main))
shape-main (find-match-by-position shape
head-copy
container
head-main
container-main)]
(guess-swap-slot shape-main container-main)))))))
(find-match-by-position [shape-copy head-copy container-copy head-main container-main]
;; Find the shape in the main that has the same position under its parent than
;; the copy under its one. To get the parent we must process recursively until
;; the component head, because mains may also have been swapped.
(let [parent-copy (ctn/get-shape container-copy (:parent-id shape-copy))
parent-main (if (= (:id parent-copy) (:id head-copy))
head-main
(find-match-by-position parent-copy
head-copy
container-copy
head-main
container-main))
index (cfh/get-position-on-parent (:objects container-copy)
(:id shape-copy))
shape-main-id (dm/get-in parent-main [:shapes index])]
(ctn/get-shape container-main shape-main-id)))]
file (-> file
(update :data process-fdata))))
(defn fix-find-duplicated-slots
[file _]
;; Find the shapes whose children have duplicated slots
(let [check-duplicate-swap-slot
(fn [shape page]
(let [shapes (map #(get (:objects page) %) (:shapes shape))
slots (->> (map #(ctk/get-swap-slot %) shapes)
(remove nil?))
counts (frequencies slots)]
#_(when (some (fn [[_ count]] (> count 1)) counts)
(l/trc :info "This shape has children with the same swap slot" :id (:id shape) :file-id (str (:id file))))
(some (fn [[_ count]] (> count 1)) counts)))
count-slots-shape
(fn [page shape]
(if (ctk/instance-root? shape)
(check-duplicate-swap-slot shape page)
false))
count-slots-page
(fn [page]
(->> (:objects page)
(vals)
(mapv #(count-slots-shape page %))
(filter true?)
count))
count-slots-data
(fn [data]
(->> (:pages-index data)
(vals)
(mapv count-slots-page)
(reduce +)))
num-missing-slots (count-slots-data (:data file))]
(when (pos? num-missing-slots)
(l/trc :info (str "Shapes with children with the same swap slot: " num-missing-slots) :file-id (str (:id file))))
file))
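The duplicate check above reduces to a `frequencies` count over the children's swap slots; a self-contained sketch with invented shape data (the `:swap-slot` key stands in for `ctk/get-swap-slot`):

```clojure
;; Two children sharing slot :a => a duplicate is detected.
(let [shapes [{:id 1 :swap-slot :a}
              {:id 2 :swap-slot :a}
              {:id 3 :swap-slot :b}]
      slots  (keep :swap-slot shapes)]
  (some (fn [[_ n]] (> n 1)) (frequencies slots)))
;; => true
```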
@@ -1,88 +0,0 @@
;; This Source Code Form is subject to the terms of the Mozilla Public
;; License, v. 2.0. If a copy of the MPL was not distributed with this
;; file, You can obtain one at http://mozilla.org/MPL/2.0/.
;;
;; Copyright (c) KALEIDOS INC
(ns app.srepl.fixes.lost-colors
"A collection of adhoc fixes scripts."
(:require
[app.binfile.common :as bfc]
[app.common.logging :as l]
[app.common.types.color :as types.color]
[app.db :as db]
[app.srepl.helpers :as h]))
(def sql:get-affected-files
"SELECT fm.file_id AS id FROM file_migration AS fm WHERE fm.name = '0008-fix-library-colors-v2'")
(def sql:get-matching-snapshot
"SELECT * FROM file_change
WHERE file_id = ?
AND created_at <= ?
AND label IS NOT NULL
AND data IS NOT NULL
ORDER BY created_at DESC
LIMIT 2")
(defn get-affected-migration
[conn file-id]
(db/get* conn :file-migration
{:name "0008-fix-library-colors-v2"
:file-id file-id}))
(defn get-last-valid-snapshot
[conn migration]
(let [[snapshot] (db/exec! conn [sql:get-matching-snapshot
(:file-id migration)
(:created-at migration)])]
(when snapshot
(let [snapshot (assoc snapshot :id (:file-id snapshot))]
(bfc/decode-file h/*system* snapshot)))))
(defn restore-color
[{:keys [data] :as snapshot} color]
(when-let [scolor (get-in data [:colors (:id color)])]
(-> (select-keys scolor types.color/library-color-attrs)
(types.color/check-library-color))))
(defn restore-missing-colors
[{:keys [id] :as file} & _opts]
(l/inf :hint "process file" :file-id (str id) :name (:name file) :has-colors (-> file :data :colors not-empty boolean))
(if-let [colors (-> file :data :colors not-empty)]
(let [migration (get-affected-migration h/*system* id)]
(if-let [snapshot (get-last-valid-snapshot h/*system* migration)]
(do
(l/inf :hint "using snapshot" :snapshot (:label snapshot))
(let [colors (reduce-kv (fn [colors color-id color]
(if-let [result (restore-color snapshot color)]
(do
(l/inf :hint "restored color" :file-id (str id) :color-id (str color-id))
(assoc colors color-id result))
(do
(l/wrn :hint "ignoring color" :file-id (str id) :color (pr-str color))
colors)))
colors
colors)
file (-> file
(update :data assoc :colors colors)
(update :migrations disj "0008-fix-library-colors-v2"))]
(db/delete! h/*system* :file-migration
{:name "0008-fix-library-colors-v2"
:file-id (:id file)})
file))
(do
(db/delete! h/*system* :file-migration
{:name "0008-fix-library-colors-v2"
:file-id (:id file)})
nil)))
(do
(db/delete! h/*system* :file-migration
{:name "0008-fix-library-colors-v2"
:file-id (:id file)})
nil)))
@@ -14,9 +14,8 @@
[app.common.files.validate :as cfv] [app.common.files.validate :as cfv]
[app.common.time :as ct] [app.common.time :as ct]
[app.db :as db] [app.db :as db]
[app.main :as main] [app.features.file-snapshots :as fsnap]
[app.rpc.commands.files :as files] [app.main :as main]))
[app.rpc.commands.files-snapshot :as fsnap]))
(def ^:dynamic *system* nil) (def ^:dynamic *system* nil)
@@ -48,7 +47,7 @@
([system id] ([system id]
(db/run! system (db/run! system
(fn [system] (fn [system]
(files/get-file system id :migrate? false))))) (bfc/get-file system id :decode? false)))))
(defn update-team! (defn update-team!
[system {:keys [id] :as team}] [system {:keys [id] :as team}]
@@ -118,10 +117,10 @@
(let [conn (db/get-connection system)] (let [conn (db/get-connection system)]
(->> (get-and-lock-team-files conn team-id) (->> (get-and-lock-team-files conn team-id)
(reduce (fn [result file-id] (reduce (fn [result file-id]
(let [file (fsnap/get-file-snapshots system file-id)] (let [file (bfc/get-file system file-id :realize? true :lock-for-update? true)]
(fsnap/create-file-snapshot! system file (fsnap/create! system file
{:label label {:label label
:created-by :admin}) :created-by "admin"})
(inc result))) (inc result)))
0)))) 0))))
@@ -132,21 +131,34 @@
(into #{})) (into #{}))
snap (search-file-snapshots conn ids label) snap (search-file-snapshots conn ids label)
ids' (into #{} (map :file-id) snap)] ids' (into #{} (map :file-id) snap)]
(when (not= ids ids') (when (not= ids ids')
(throw (RuntimeException. "no uniform snapshot available"))) (throw (RuntimeException. "no uniform snapshot available")))
(reduce (fn [result {:keys [file-id id]}] (reduce (fn [result {:keys [file-id id]}]
(fsnap/restore-file-snapshot! system file-id id) (fsnap/restore! system file-id id)
(inc result)) (inc result))
0 0
snap))) snap)))
(defn mark-migrated!
"A helper that inserts an entry in the file migration table for make
file migrated for the specified migration label."
[system file-id label]
(db/insert! system :file-migration
{:file-id file-id
:name label}
{::db/return-keys false}))
(defn process-file! (defn process-file!
[system file-id update-fn & {:keys [label validate? with-libraries?] :or {validate? true} :as opts}] [system file-id update-fn
(let [file (bfc/get-file system file-id ::db/for-update true) & {:keys [::snapshot-label ::validate? ::with-libraries?]
:or {validate? true} :as opts}]
(let [file (bfc/get-file system file-id
:lock-for-update? true
:realize? true)
libs (when with-libraries? libs (when with-libraries?
(bfc/get-resolved-file-libraries system file)) (bfc/get-resolved-file-libraries system file))
@@ -162,12 +174,12 @@
(when validate? (when validate?
(cfv/validate-file-schema! file')) (cfv/validate-file-schema! file'))
(when (string? label) (when (string? snapshot-label)
(fsnap/create-file-snapshot! system file (fsnap/create! system file
{:label label {:label snapshot-label
:deleted-at (ct/in-future {:days 30}) :deleted-at (ct/in-future {:days 30})
:created-by :admin})) :created-by "admin"}))
(let [file' (update file' :revn inc)] (let [file' (update file' :revn inc)]
(bfc/update-file! system file') (bfc/update-file! system file' opts)
true)))) true))))
@@ -5,7 +5,6 @@
;; Copyright (c) KALEIDOS INC ;; Copyright (c) KALEIDOS INC
(ns app.srepl.main (ns app.srepl.main
"A collection of adhoc fixes scripts."
#_:clj-kondo/ignore #_:clj-kondo/ignore
(:require (:require
[app.auth :refer [derive-password]] [app.auth :refer [derive-password]]
@@ -24,19 +23,19 @@
[app.config :as cf] [app.config :as cf]
[app.db :as db] [app.db :as db]
[app.db.sql :as-alias sql] [app.db.sql :as-alias sql]
[app.features.fdata :as feat.fdata] [app.features.fdata :as fdata]
[app.features.file-snapshots :as fsnap]
[app.loggers.audit :as audit] [app.loggers.audit :as audit]
[app.main :as main] [app.main :as main]
[app.msgbus :as mbus] [app.msgbus :as mbus]
[app.rpc.commands.auth :as auth] [app.rpc.commands.auth :as auth]
[app.rpc.commands.files :as files] [app.rpc.commands.files :as files]
[app.rpc.commands.files-snapshot :as fsnap]
[app.rpc.commands.management :as mgmt] [app.rpc.commands.management :as mgmt]
[app.rpc.commands.profile :as profile] [app.rpc.commands.profile :as profile]
[app.rpc.commands.projects :as projects] [app.rpc.commands.projects :as projects]
[app.rpc.commands.teams :as teams] [app.rpc.commands.teams :as teams]
[app.srepl.fixes :as fixes]
[app.srepl.helpers :as h] [app.srepl.helpers :as h]
[app.srepl.procs.file-repair :as procs.file-repair]
[app.util.blob :as blob] [app.util.blob :as blob]
[app.util.pointer-map :as pmap] [app.util.pointer-map :as pmap]
[app.worker :as wrk] [app.worker :as wrk]
@@ -48,6 +47,7 @@
[cuerdas.core :as str] [cuerdas.core :as str]
[datoteka.fs :as fs] [datoteka.fs :as fs]
[promesa.exec :as px] [promesa.exec :as px]
[promesa.exec.csp :as sp]
[promesa.exec.semaphore :as ps] [promesa.exec.semaphore :as ps]
[promesa.util :as pu])) [promesa.util :as pu]))
@@ -147,25 +147,6 @@
;; FEATURES ;; FEATURES
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;; ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(declare process-file!)
(defn enable-objects-map-feature-on-file!
[file-id & {:as opts}]
(process-file! file-id feat.fdata/enable-objects-map opts))
(defn enable-pointer-map-feature-on-file!
[file-id & {:as opts}]
(process-file! file-id feat.fdata/enable-pointer-map opts))
(defn enable-path-data-feature-on-file!
[file-id & {:as opts}]
(process-file! file-id feat.fdata/enable-path-data opts))
(defn enable-storage-features-on-file!
[file-id & {:as opts}]
(enable-objects-map-feature-on-file! file-id opts)
(enable-pointer-map-feature-on-file! file-id opts))
(defn enable-team-feature! (defn enable-team-feature!
[team-id feature & {:keys [skip-check] :or {skip-check false}}] [team-id feature & {:keys [skip-check] :or {skip-check false}}]
(when (and (not skip-check) (not (contains? cfeat/supported-features feature))) (when (and (not skip-check) (not (contains? cfeat/supported-features feature)))
@@ -339,7 +320,10 @@
collectable file-changes entry." collectable file-changes entry."
[& {:keys [file-id label]}] [& {:keys [file-id label]}]
(let [file-id (h/parse-uuid file-id)] (let [file-id (h/parse-uuid file-id)]
(db/tx-run! main/system fsnap/create-file-snapshot! {:file-id file-id :label label}))) (db/tx-run! main/system
(fn [cfg]
(let [file (bfc/get-file cfg file-id :realize? true)]
(fsnap/create! cfg file {:label label :created-by "admin"}))))))
(defn restore-file-snapshot! (defn restore-file-snapshot!
[file-id & {:keys [label id]}] [file-id & {:keys [label id]}]
@@ -349,13 +333,13 @@
(fn [{:keys [::db/conn] :as system}] (fn [{:keys [::db/conn] :as system}]
(cond (cond
(uuid? snapshot-id) (uuid? snapshot-id)
(fsnap/restore-file-snapshot! system file-id snapshot-id) (fsnap/restore! system file-id snapshot-id)
(string? label) (string? label)
(->> (h/search-file-snapshots conn #{file-id} label) (->> (h/search-file-snapshots conn #{file-id} label)
(map :id) (map :id)
(first) (first)
(fsnap/restore-file-snapshot! system file-id)) (fsnap/restore! system file-id))
:else :else
(throw (ex-info "snapshot id or label should be provided" {}))))))) (throw (ex-info "snapshot id or label should be provided" {})))))))
@@ -364,9 +348,9 @@
[file-id & {:as _}] [file-id & {:as _}]
(let [file-id (h/parse-uuid file-id)] (let [file-id (h/parse-uuid file-id)]
(db/tx-run! main/system (db/tx-run! main/system
(fn [{:keys [::db/conn]}] (fn [cfg]
(->> (fsnap/get-file-snapshots conn file-id) (->> (fsnap/get-visible-snapshots cfg file-id)
(print-table [:label :id :revn :created-at])))))) (print-table [:label :id :revn :created-at :created-by]))))))
(defn take-team-snapshot! (defn take-team-snapshot!
[team-id & {:keys [label rollback?] :or {rollback? true}}] [team-id & {:keys [label rollback?] :or {rollback? true}}]
@@ -413,24 +397,19 @@
(println (sm/humanize-explain explain)) (println (sm/humanize-explain explain))
(ex/print-throwable cause)))))))) (ex/print-throwable cause))))))))
(defn repair-file!
"Repair the list of errors detected by validation."
[file-id & {:keys [rollback?] :or {rollback? true} :as opts}]
(let [system (assoc main/system ::db/rollback rollback?)
file-id (h/parse-uuid file-id)
opts (assoc opts :with-libraries? true)]
(db/tx-run! system h/process-file! file-id fixes/repair-file opts)))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;; ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; PROCESSING ;; PROCESSING
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;; ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(def sql:get-files
  "SELECT id FROM file
    WHERE deleted_at is NULL
    ORDER BY created_at DESC")

(defn repair-file!
  "Repair the list of errors detected by validation."
  [file-id & {:keys [rollback?] :or {rollback? true} :as options}]
  (let [system  (assoc main/system ::db/rollback rollback?)
        file-id (h/parse-uuid file-id)
        options (assoc options ::h/with-libraries? true)]
    (db/tx-run! system h/process-file! file-id procs.file-repair/repair-file options)))
(defn process-file! (defn update-file!
"Apply a function to the file. Optionally save the changes or not. "Apply a function to the file. Optionally save the changes or not.
The function receives the decoded and migrated file data." The function receives the decoded and migrated file data."
[file-id update-fn & {:keys [rollback?] :or {rollback? true} :as opts}] [file-id update-fn & {:keys [rollback?] :or {rollback? true} :as opts}]
@@ -441,114 +420,117 @@
db/*conn* (db/get-connection system)] db/*conn* (db/get-connection system)]
(h/process-file! system file-id update-fn opts)))))) (h/process-file! system file-id update-fn opts))))))
(defn process-team-files! (defn process!
"Apply a function to each file of the specified team." [& {:keys [max-items
[team-id update-fn & {:keys [rollback? label] :or {rollback? true} :as opts}]
(let [team-id (h/parse-uuid team-id)
opts (dissoc opts :label)]
(db/tx-run! (assoc main/system ::db/rollback rollback?)
(fn [{:keys [::db/conn] :as system}]
(when (string? label)
(h/take-team-snapshot! system team-id label))
(binding [h/*system* system
db/*conn* (db/get-connection system)]
(->> (h/get-and-lock-team-files conn team-id)
(reduce (fn [result file-id]
(if (h/process-file! system file-id update-fn opts)
(inc result)
result))
0)))))))
(defn process-files!
"Apply a function to all files in the database"
[update-fn & {:keys [max-items
max-jobs max-jobs
rollback? rollback?
query] query
:or {max-jobs 1 proc-fn
max-items Long/MAX_VALUE buffer]
:or {max-items Long/MAX_VALUE
rollback? true rollback? true
query sql:get-files} max-jobs 1
buffer 128}
:as opts}] :as opts}]
(l/dbg :hint "process:start" (l/inf :hint "process start"
:rollback rollback? :rollback rollback?
:max-jobs max-jobs :max-jobs max-jobs
:max-items max-items) :max-items max-items)
(let [tpoint (ct/tpoint) (let [tpoint (ct/tpoint)
factory (px/thread-factory :virtual false :prefix "penpot/file-process/") max-jobs (or max-jobs (px/get-available-processors))
executor (px/cached-executor :factory factory) query (or query
sjobs (ps/create :permits max-jobs) (:query (meta proc-fn))
(throw (ex-info "missing query" {})))
query (if (vector? query) query [query])
process-file proc-fn (if (var? proc-fn)
(fn [file-id idx tpoint] (deref proc-fn)
(let [thread-id (px/get-thread-id)] proc-fn)
in-ch (sp/chan :buf buffer)
worker-fn
(fn [worker-id]
(l/dbg :hint "worker started"
:id worker-id)
(loop []
(when-let [[index item] (sp/<! in-ch)]
(l/dbg :hint "process item" :worker-id worker-id :index index :item item)
(try (try
(l/trc :hint "process:file:start" (-> main/system
:tid thread-id (assoc ::db/rollback rollback?)
:file-id (str file-id) (db/tx-run! (fn [system]
:index idx)
(let [system (assoc main/system ::db/rollback rollback?)]
(db/tx-run! system (fn [system]
(binding [h/*system* system (binding [h/*system* system
db/*conn* (db/get-connection system)] db/*conn* (db/get-connection system)]
(h/process-file! system file-id update-fn opts))))) (proc-fn system item opts)))))
(catch Throwable cause (catch Throwable cause
(l/wrn :hint "unexpected error on processing file (skiping)" (l/wrn :hint "unexpected error on processing item (skiping)"
:tid thread-id :worker-id worker-id
:file-id (str file-id) :item item
:index idx
:cause cause)) :cause cause))
(finally (finally
(when-let [pause (:pause opts)] (when-let [pause (:pause opts)]
(Thread/sleep (int pause))) (Thread/sleep (int pause)))))
(ps/release! sjobs) (recur)))
(let [elapsed (ct/format-duration (tpoint))]
(l/trc :hint "process:file:end"
:tid thread-id
:file-id (str file-id)
:index idx
:elapsed elapsed))))))
process-file* (l/dbg :hint "worker stopped"
(fn [idx file-id] :id worker-id))
(ps/acquire! sjobs)
(px/run! executor (partial process-file file-id idx (ct/tpoint)))
(inc idx))
process-files enqueue-item
(fn [index row]
(sp/>! in-ch [index (into {} row)])
(inc index))
process-items
(fn [{:keys [::db/conn] :as system}] (fn [{:keys [::db/conn] :as system}]
(db/exec! conn ["SET statement_timeout = 0"]) (db/exec! conn ["SET statement_timeout = 0"])
(db/exec! conn ["SET idle_in_transaction_session_timeout = 0"]) (db/exec! conn ["SET idle_in_transaction_session_timeout = 0"])
(try (->> (db/plan conn query {:fetch-size (* max-jobs 3)})
(->> (db/plan conn [query]) (transduce (take max-items)
(transduce (comp (completing enqueue-item)
(take max-items)
(map :id))
(completing process-file*)
0)) 0))
(finally (sp/close! in-ch))
;; Close and await tasks
(pu/close! executor))))] threads
(->> (range max-jobs)
(map (fn [idx]
(px/fn->thread (partial worker-fn idx)
:name (str "pentpot/process/" idx))))
(doall))]
(try (try
(db/tx-run! main/system process-files) (db/tx-run! main/system process-items)
;; Await threads termination
(doseq [thread threads]
(px/await! thread))
(catch Throwable cause (catch Throwable cause
(l/dbg :hint "process:error" :cause cause)) (l/dbg :hint "process:error" :cause cause))
(finally (finally
(let [elapsed (ct/format-duration (tpoint))] (let [elapsed (ct/format-duration (tpoint))]
(l/dbg :hint "process:end" (l/inf :hint "process end"
:rollback rollback? :rollback rollback?
:elapsed elapsed)))))) :elapsed elapsed))))))
(defn process-file!
"A specialized, file specific process! alternative"
[& {:keys [id] :as opts}]
(let [id (h/parse-uuid id)]
(-> opts
(assoc :query ["select id from file where id = ?" id])
(assoc :max-items 1)
(assoc :max-jobs 1)
(process!))))
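A hedged example of invoking the new `process!` driver: the proc var carries its driving SQL as `:query` metadata, following the pattern of the procs namespaces added in this change. The require alias is an assumption:

```clojure
;; Sketch: run a proc across matching rows with 4 workers, keeping the
;; transaction rollback safety net enabled (the default).
(require '[app.srepl.procs.fdata-storage :as procs.fdata]) ;; assumed alias
(process! :proc-fn #'procs.fdata/migrate-file-to-storage
          :max-jobs 4
          :max-items 1000
          :rollback? true)
```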
(defn mark-file-as-trimmed (defn mark-file-as-trimmed
[id] [id]
(let [id (h/parse-uuid id)] (let [id (h/parse-uuid id)]
@@ -590,25 +572,34 @@
(db/update! conn :file (db/update! conn :file
{:deleted-at nil {:deleted-at nil
:has-media-trimmed false} :has-media-trimmed false}
{:id file-id}) {:id file-id}
{::db/return-keys false})
;; Fragments are not handled here because they
;; use the database cascade operation and they
;; are not marked for deletion with objects-gc
;; task
(db/update! conn :file-media-object (db/update! conn :file-media-object
{:deleted-at nil} {:deleted-at nil}
{:file-id file-id}) {:file-id file-id}
{::db/return-keys false})
(db/update! conn :file-change
{:deleted-at nil}
{:file-id file-id}
{::db/return-keys false})
(db/update! conn :file-data
{:deleted-at nil}
{:file-id file-id}
{::db/return-keys false})
;; Mark thumbnails to be deleted ;; Mark thumbnails to be deleted
(db/update! conn :file-thumbnail (db/update! conn :file-thumbnail
{:deleted-at nil} {:deleted-at nil}
{:file-id file-id}) {:file-id file-id}
{::db/return-keys false})
(db/update! conn :file-tagged-object-thumbnail (db/update! conn :file-tagged-object-thumbnail
{:deleted-at nil} {:deleted-at nil}
{:file-id file-id}) {:file-id file-id}
{::db/return-keys false})
:restored) :restored)
@@ -618,11 +609,10 @@
(let [file-id (h/parse-uuid file-id)] (let [file-id (h/parse-uuid file-id)]
(db/tx-run! main/system (db/tx-run! main/system
(fn [system] (fn [system]
(when-let [file (some-> (db/get* system :file (when-let [file (db/get* system :file
{:id file-id} {:id file-id}
{::db/remove-deleted false {::db/remove-deleted false
::sql/columns [:id :name]}) ::sql/columns [:id :name]})]
(files/decode-row))]
(audit/insert! system (audit/insert! system
{::audit/name "restore-file" {::audit/name "restore-file"
::audit/type "action" ::audit/type "action"
@@ -0,0 +1,141 @@
;; This Source Code Form is subject to the terms of the Mozilla Public
;; License, v. 2.0. If a copy of the MPL was not distributed with this
;; file, You can obtain one at http://mozilla.org/MPL/2.0/.
;;
;; Copyright (c) KALEIDOS INC
(ns app.srepl.procs.fdata-storage
(:require
[app.common.logging :as l]
[app.db :as db]))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; SNAPSHOTS
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(def sql:get-unmigrated-snapshots
"SELECT fc.id, fc.file_id
FROM file_change AS fc
WHERE fc.data IS NOT NULL
AND fc.label IS NOT NULL
ORDER BY fc.id ASC")
(def sql:get-migrated-snapshots
"SELECT f.id, f.file_id
FROM file_data AS f
WHERE f.data IS NOT NULL
AND f.type = 'snapshot'
AND f.id != f.file_id
ORDER BY f.id ASC")
(defn migrate-snapshot-to-storage
"Migrate the current existing files to store data in new storage
tables."
{:query sql:get-unmigrated-snapshots}
[{:keys [::db/conn]} {:keys [id file-id]} & {:as options}]
(let [{:keys [id file-id data created-at updated-at]}
(db/get* conn :file-change {:id id :file-id file-id}
::db/for-update true
::db/remove-deleted false)]
(when data
(l/inf :hint "migrating snapshot" :file-id (str file-id) :id (str id))
(db/update! conn :file-change
{:data nil}
{:id id :file-id file-id}
{::db/return-keys false})
(db/insert! conn :file-data
{:backend "db"
:metadata nil
:type "snapshot"
:data data
:created-at created-at
:modified-at updated-at
:file-id file-id
:id id}
{::db/return-keys false}))))
(defn rollback-snapshot-from-storage
"Migrate back to the file table storage."
  {:query sql:get-migrated-snapshots}
[{:keys [::db/conn]} {:keys [id file-id]} & {:as opts}]
(when-let [{:keys [id file-id data]}
(db/get* conn :file-data {:id id :file-id file-id :type "snapshot"}
::db/for-update true
::db/remove-deleted false)]
(l/inf :hint "rollback snapshot" :file-id (str file-id) :id (str id))
(db/update! conn :file-change
{:data data}
{:id id :file-id file-id}
{::db/return-keys false})
(db/delete! conn :file-data
{:id id :file-id file-id :type "snapshot"}
{::db/return-keys false})))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; FILES
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(def sql:get-unmigrated-files
"SELECT f.id
FROM file AS f
WHERE f.data IS NOT NULL
ORDER BY f.modified_at ASC")
(def sql:get-migrated-files
"SELECT f.id, f.file_id
FROM file_data AS f
WHERE f.data IS NOT NULL
AND f.id = f.file_id
ORDER BY f.id ASC")
(defn migrate-file-to-storage
  "Migrate existing files to store their data in the new storage
  tables."
{:query sql:get-unmigrated-files}
[{:keys [::db/conn] :as cfg} {:keys [id]} & {:as opts}]
(let [{:keys [id data created-at modified-at]}
(db/get* conn :file {:id id}
::db/for-update true
::db/remove-deleted false)]
(when data
(l/inf :hint "migrating file" :file-id (str id))
(db/update! conn :file {:data nil} {:id id} ::db/return-keys false)
(db/insert! conn :file-data
{:backend "db"
:metadata nil
:type "main"
:data data
:created-at created-at
:modified-at modified-at
:file-id id
:id id}
{::db/return-keys false}))
(let [snapshots-sql
(str "WITH snapshots AS (" sql:get-unmigrated-snapshots ") "
"SELECT s.* FROM snapshots AS s WHERE s.file_id = ?")]
(run! (fn [params]
(migrate-snapshot-to-storage cfg params opts))
(db/plan cfg [snapshots-sql id])))))
(defn rollback-file-from-storage
"Migrate back to the file table storage."
{:query sql:get-migrated-files}
[{:keys [::db/conn] :as cfg} {:keys [id]} & {:as opts}]
(when-let [{:keys [id data]}
(db/get* conn :file-data {:id id :file-id id :type "main"}
::db/for-update true
::db/remove-deleted false)]
(l/inf :hint "rollback file" :file-id (str id))
(db/update! conn :file {:data data} {:id id} ::db/return-keys false)
(db/delete! conn :file-data {:file-id id :id id :type "main"} ::db/return-keys false)
(let [snapshots-sql
(str "WITH snapshots AS (" sql:get-migrated-snapshots ") "
"SELECT s.* FROM snapshots AS s WHERE s.file_id = ?")]
(run! (fn [params]
(rollback-snapshot-from-storage cfg params opts))
(db/plan cfg [snapshots-sql id])))))
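
;; Editor's note: `migrate-file-to-storage` and
;; `rollback-file-from-storage` form an inverse pair, each annotated
;; with the SQL that selects its candidate rows via the :query
;; attr-map. A minimal sketch of how such a proc could be driven from
;; the REPL follows; the `run-proc!` helper is an illustrative
;; assumption, not part of this patch:

```clojure
;; Illustrative only: iterate the rows selected by the proc's :query
;; var metadata and apply the proc to each row in its own transaction,
;; reusing the db/plan and db/tx-run! helpers seen above.
(defn run-proc!
  [system proc-var & {:as opts}]
  (let [query (:query (meta proc-var))]
    (run! (fn [row]
            (db/tx-run! system (fn [cfg] (proc-var cfg row opts))))
          (db/plan system [query]))))

;; (run-proc! main/system #'migrate-file-to-storage)
;; (run-proc! main/system #'rollback-file-from-storage)
```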

View File

@@ -0,0 +1,60 @@
;; This Source Code Form is subject to the terms of the Mozilla Public
;; License, v. 2.0. If a copy of the MPL was not distributed with this
;; file, You can obtain one at http://mozilla.org/MPL/2.0/.
;;
;; Copyright (c) KALEIDOS INC
(ns app.srepl.procs.file-repair
(:require
[app.common.files.changes :as cfc]
[app.common.files.repair :as cfr]
[app.common.files.validate :as cfv]
[app.common.logging :as l]))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;; GENERAL PURPOSE REPAIR
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defn repair-file
  "Internal helper for validating and repairing a file. The operation
  is applied repeatedly until the file is fixed or the max iteration
  count is reached (default 10).

  This function should not be used directly; it is used through the
  app.srepl.main/repair-file! helper. In practical terms it is private
  and an implementation detail."
[file libs & {:keys [max-iterations] :or {max-iterations 10}}]
(let [validate-and-repair
(fn [file libs iteration]
(when-let [errors (not-empty (cfv/validate-file file libs))]
(l/trc :hint "repairing file"
:file-id (str (:id file))
:iteration iteration
:errors (count errors))
(let [changes (cfr/repair-file file libs errors)]
(-> file
(update :revn inc)
(update :data cfc/process-changes changes)))))
process-file
(fn [file libs]
(loop [file file
iteration 0]
(if (< iteration max-iterations)
(if-let [file (validate-and-repair file libs iteration)]
(recur file (inc iteration))
file)
(do
(l/wrn :hint "max retry num reached on repairing file"
:file-id (str (:id file))
:iteration iteration)
file))))
file'
(process-file file libs)]
(when (not= (:revn file) (:revn file'))
(l/trc :hint "file repaired" :file-id (str (:id file))))
file'))
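
;; Editor's note: the loop above is a bounded fixed-point iteration.
;; Stripped of penpot specifics, the same pattern looks like this
;; (a generic sketch, not code from this patch):

```clojure
;; Generic bounded fixed-point loop mirroring repair-file: `step`
;; returns an updated value while work remains, or nil when done.
(defn iterate-until-fixed
  [step value max-iterations]
  (loop [value value
         i     0]
    (if (< i max-iterations)
      (if-let [value' (step value i)]
        (recur value' (inc i))
        value)
      value)))
```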

View File

@@ -4,10 +4,11 @@
 ;;
 ;; Copyright (c) KALEIDOS INC

-(ns app.srepl.fixes.media-refs
+(ns app.srepl.procs.media-refs
   (:require
    [app.binfile.common :as bfc]
    [app.common.files.helpers :as cfh]
+   [app.common.logging :as l]
    [app.srepl.helpers :as h]))

 (defn- collect-media-refs
@@ -37,7 +38,22 @@
   (let [media-refs (collect-media-refs (:data file))]
     (bfc/update-media-references! cfg file media-refs)))

-(defn process-file
-  [file _opts]
-  (let [system (h/get-current-system)]
-    (update-all-media-references system file)))
+(def ^:private sql:get-files
+  "SELECT f.id
+     FROM file AS f
+     LEFT JOIN file_migration AS fm ON (fm.file_id = f.id AND fm.name = 'internal/procs/media-refs')
+    WHERE fm.name IS NULL
+    ORDER BY f.project_id")
+
+(defn fix-media-refs
+  {:query sql:get-files}
+  [cfg {:keys [id]} & {:as options}]
+  (l/inf :hint "processing file" :id (str id))
+  (h/process-file! cfg id
+                   (fn [file _opts]
+                     (update-all-media-references cfg file))
+                   (assoc options
+                          ::bfc/reset-migrations? true
+                          ::h/validate? false))
+  (h/mark-migrated! cfg id "internal/procs/media-refs"))

View File

@@ -0,0 +1,57 @@
;; This Source Code Form is subject to the terms of the Mozilla Public
;; License, v. 2.0. If a copy of the MPL was not distributed with this
;; file, You can obtain one at http://mozilla.org/MPL/2.0/.
;;
;; Copyright (c) KALEIDOS INC
(ns app.srepl.procs.path-data
(:require
[app.binfile.common :as bfc]
[app.common.data :as d]
[app.common.files.helpers :as cfh]
[app.common.logging :as l]
[app.srepl.helpers :as h]))
(def ^:private sql:get-files-with-path-data
"SELECT id FROM file WHERE features @> '{fdata/path-data}'")
(defn disable
  "A script responsible for removing the path data type from file
  data, allowing the file to be opened in older penpot versions.

  Should be used only when you want to downgrade to an older penpot
  version for some reason."
{:query sql:get-files-with-path-data}
[cfg {:keys [id]} & {:as options}]
(l/inf :hint "disabling path-data" :file-id (str id))
(let [update-object
(fn [object]
(if (or (cfh/path-shape? object)
(cfh/bool-shape? object))
(update object :content vec)
object))
update-container
(fn [container]
(d/update-when container :objects d/update-vals update-object))
update-file
(fn [file & _opts]
(-> file
(update :data (fn [data]
(-> data
(update :pages-index d/update-vals update-container)
(d/update-when :components d/update-vals update-container))))
(update :features disj "fdata/path-data")
(update :migrations disj
"0003-convert-path-content-v2"
"0003-convert-path-content")))
options
(-> options
(assoc ::bfc/reset-migrations? true)
(assoc ::h/validate? false))]
(h/process-file! cfg id update-file options)))

View File

@@ -27,7 +27,9 @@
 (defn get-legacy-backend
   []
-  (let [name (cf/get :assets-storage-backend)]
+  (when-let [name (cf/get :assets-storage-backend)]
+    (l/wrn :hint "using deprecated configuration, please read 2.11 release notes"
+           :href "https://github.com/penpot/penpot/releases/tag/2.11.0")
     (case name
       :assets-fs :fs
       :assets-s3 :s3
@@ -113,13 +115,10 @@
 (defn- create-database-object
   [{:keys [::backend ::db/connectable]} {:keys [::content ::expired-at ::touched-at ::touch] :as params}]
-  (let [id    (or (:id params) (uuid/random))
+  (let [id    (or (::id params) (uuid/random))
         mdata (cond-> (get-metadata params)
                 (satisfies? impl/IContentHash content)
-                (assoc :hash (impl/get-hash content))
-                :always
-                (dissoc :id))
+                (assoc :hash (impl/get-hash content)))
         touched-at (if touch
                      (or touched-at (ct/now))

View File

@@ -34,7 +34,7 @@
    (SELECT EXISTS (SELECT 1 FROM team_font_variant WHERE ttf_file_id = ?))) AS has_refs")

 (defn- has-team-font-variant-refs?
-  [conn id]
+  [conn {:keys [id]}]
   (-> (db/exec-one! conn [sql:has-team-font-variant-refs id id id id])
       (get :has-refs)))
@@ -44,7 +44,7 @@
    (SELECT EXISTS (SELECT 1 FROM file_media_object WHERE thumbnail_id = ?))) AS has_refs")

 (defn- has-file-media-object-refs?
-  [conn id]
+  [conn {:keys [id]}]
   (-> (db/exec-one! conn [sql:has-file-media-object-refs id id])
       (get :has-refs)))
@@ -53,7 +53,7 @@
    (SELECT EXISTS (SELECT 1 FROM team WHERE photo_id = ?))) AS has_refs")

 (defn- has-profile-refs?
-  [conn id]
+  [conn {:keys [id]}]
   (-> (db/exec-one! conn [sql:has-profile-refs id id])
       (get :has-refs)))
@@ -62,7 +62,7 @@
   "SELECT EXISTS (SELECT 1 FROM file_tagged_object_thumbnail WHERE media_id = ?) AS has_refs")

 (defn- has-file-object-thumbnails-refs?
-  [conn id]
+  [conn {:keys [id]}]
   (-> (db/exec-one! conn [sql:has-file-object-thumbnail-refs id])
       (get :has-refs)))
@@ -71,36 +71,23 @@
   "SELECT EXISTS (SELECT 1 FROM file_thumbnail WHERE media_id = ?) AS has_refs")

 (defn- has-file-thumbnails-refs?
-  [conn id]
+  [conn {:keys [id]}]
   (-> (db/exec-one! conn [sql:has-file-thumbnail-refs id])
       (get :has-refs)))

-(def ^:private
-  sql:has-file-data-refs
-  "SELECT EXISTS (SELECT 1 FROM file WHERE data_ref_id = ?) AS has_refs")
+(def sql:exists-file-data-refs
+  "SELECT EXISTS (
+     SELECT 1 FROM file_data
+      WHERE file_id = ?
+        AND id = ?
+        AND metadata->>'storage-ref-id' = ?::text
+   ) AS has_refs")

 (defn- has-file-data-refs?
-  [conn id]
-  (-> (db/exec-one! conn [sql:has-file-data-refs id])
-      (get :has-refs)))
-
-(def ^:private
-  sql:has-file-data-fragment-refs
-  "SELECT EXISTS (SELECT 1 FROM file_data_fragment WHERE data_ref_id = ?) AS has_refs")
-
-(defn- has-file-data-fragment-refs?
-  [conn id]
-  (-> (db/exec-one! conn [sql:has-file-data-fragment-refs id])
-      (get :has-refs)))
-
-(def ^:private
-  sql:has-file-change-refs
-  "SELECT EXISTS (SELECT 1 FROM file_change WHERE data_ref_id = ?) AS has_refs")
-
-(defn- has-file-change-refs?
-  [conn id]
-  (-> (db/exec-one! conn [sql:has-file-change-refs id])
-      (get :has-refs)))
+  [conn sobject]
+  (let [{:keys [file-id id]} (:metadata sobject)]
+    (-> (db/exec-one! conn [sql:exists-file-data-refs file-id id (:id sobject)])
+        (get :has-refs))))

 (def ^:private sql:mark-freeze-in-bulk
   "UPDATE storage_object
@@ -143,52 +130,48 @@
      "file-media-object"))

 (defn- process-objects!
-  [conn has-refs? ids bucket]
+  [conn has-refs? bucket objects]
   (loop [to-freeze #{}
          to-delete #{}
-         ids       (seq ids)]
-    (if-let [id (first ids)]
-      (if (has-refs? conn id)
+         objects   (seq objects)]
+    (if-let [{:keys [id] :as object} (first objects)]
+      (if (has-refs? conn object)
         (do
-          (l/debug :hint "processing object"
-                   :id (str id)
+          (l/debug :id (str id)
                    :status "freeze"
                    :bucket bucket)
-          (recur (conj to-freeze id) to-delete (rest ids)))
+          (recur (conj to-freeze id) to-delete (rest objects)))
         (do
-          (l/debug :hint "processing object"
-                   :id (str id)
+          (l/debug :id (str id)
                    :status "delete"
                    :bucket bucket)
-          (recur to-freeze (conj to-delete id) (rest ids))))
+          (recur to-freeze (conj to-delete id) (rest objects))))
       (do
        (some->> (seq to-freeze) (mark-freeze-in-bulk! conn))
        (some->> (seq to-delete) (mark-delete-in-bulk! conn))
        [(count to-freeze) (count to-delete)]))))

 (defn- process-bucket!
-  [conn bucket ids]
+  [conn bucket objects]
   (case bucket
-    "file-media-object"     (process-objects! conn has-file-media-object-refs? ids bucket)
-    "team-font-variant"     (process-objects! conn has-team-font-variant-refs? ids bucket)
-    "file-object-thumbnail" (process-objects! conn has-file-object-thumbnails-refs? ids bucket)
-    "file-thumbnail"        (process-objects! conn has-file-thumbnails-refs? ids bucket)
-    "profile"               (process-objects! conn has-profile-refs? ids bucket)
-    "file-data"             (process-objects! conn has-file-data-refs? ids bucket)
-    "file-data-fragment"    (process-objects! conn has-file-data-fragment-refs? ids bucket)
-    "file-change"           (process-objects! conn has-file-change-refs? ids bucket)
+    "file-media-object"     (process-objects! conn has-file-media-object-refs? bucket objects)
+    "team-font-variant"     (process-objects! conn has-team-font-variant-refs? bucket objects)
+    "file-object-thumbnail" (process-objects! conn has-file-object-thumbnails-refs? bucket objects)
+    "file-thumbnail"        (process-objects! conn has-file-thumbnails-refs? bucket objects)
+    "profile"               (process-objects! conn has-profile-refs? bucket objects)
+    "file-data"             (process-objects! conn has-file-data-refs? bucket objects)
     (ex/raise :type :internal
               :code :unexpected-unknown-reference
               :hint (dm/fmt "unknown reference '%'" bucket))))

 (defn process-chunk!
   [{:keys [::db/conn]} chunk]
-  (reduce-kv (fn [[nfo ndo] bucket ids]
-               (let [[nfo' ndo'] (process-bucket! conn bucket ids)]
+  (reduce-kv (fn [[nfo ndo] bucket objects]
+               (let [[nfo' ndo'] (process-bucket! conn bucket objects)]
                  [(+ nfo nfo')
                   (+ ndo ndo')]))
              [0 0]
-             (d/group-by lookup-bucket :id #{} chunk)))
+             (d/group-by lookup-bucket identity #{} chunk)))

 (def ^:private
   sql:get-touched-storage-objects
@@ -214,12 +197,7 @@
       (let [[nfo ndo] (db/tx-run! cfg process-chunk! chunk)]
         (recur (long (+ freezed nfo))
                (long (+ deleted ndo))))
-      (do
-        (l/inf :hint "task finished"
-               :to-freeze freezed
-               :to-delete deleted)
-        {:freeze freezed :delete deleted}))))
+      {:freeze freezed :delete deleted})))

 ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
 ;; HANDLER

View File

@@ -31,13 +31,13 @@
    java.time.Duration
    java.util.Collection
    java.util.Optional
+   java.util.concurrent.atomic.AtomicLong
    org.reactivestreams.Subscriber
    software.amazon.awssdk.core.ResponseBytes
    software.amazon.awssdk.core.async.AsyncRequestBody
    software.amazon.awssdk.core.async.AsyncResponseTransformer
    software.amazon.awssdk.core.async.BlockingInputStreamAsyncRequestBody
    software.amazon.awssdk.core.client.config.ClientAsyncConfiguration
-   software.amazon.awssdk.core.client.config.SdkAdvancedAsyncClientOption
    software.amazon.awssdk.http.nio.netty.NettyNioAsyncHttpClient
    software.amazon.awssdk.http.nio.netty.SdkEventLoopGroup
    software.amazon.awssdk.regions.Region
@@ -87,12 +87,11 @@
 (def ^:private schema:config
   [:map {:title "s3-backend-config"}
-   ::wrk/executor
+   ::wrk/netty-io-executor
    [::region {:optional true} :keyword]
    [::bucket {:optional true} ::sm/text]
    [::prefix {:optional true} ::sm/text]
-   [::endpoint {:optional true} ::sm/uri]
-   [::io-threads {:optional true} ::sm/int]])
+   [::endpoint {:optional true} ::sm/uri]])

 (defmethod ig/expand-key ::backend
   [k v]
@@ -110,6 +109,7 @@
         presigner (build-s3-presigner params)]
     (assoc params
            ::sto/type :s3
+           ::counter (AtomicLong. 0)
            ::client @client
            ::presigner presigner
            ::close-fn #(.close ^java.lang.AutoCloseable client))))
@@ -121,7 +121,7 @@
 (defmethod ig/halt-key! ::backend
   [_ {:keys [::close-fn]}]
   (when (fn? close-fn)
-    (px/run! close-fn)))
+    (close-fn)))

 (def ^:private schema:backend
   [:map {:title "s3-backend"}
@@ -198,19 +198,16 @@
   (Region/of (name region)))

 (defn- build-s3-client
-  [{:keys [::region ::endpoint ::io-threads ::wrk/executor]}]
+  [{:keys [::region ::endpoint ::wrk/netty-io-executor]}]
   (let [aconfig (-> (ClientAsyncConfiguration/builder)
-                    (.advancedOption SdkAdvancedAsyncClientOption/FUTURE_COMPLETION_EXECUTOR executor)
                     (.build))
         sconfig (-> (S3Configuration/builder)
                     (cond-> (some? endpoint) (.pathStyleAccessEnabled true))
                     (.build))
-        thr-num (or io-threads (min 16 (px/get-available-processors)))
         hclient (-> (NettyNioAsyncHttpClient/builder)
-                    (.eventLoopGroupBuilder (-> (SdkEventLoopGroup/builder)
-                                                (.numberOfThreads (int thr-num))))
+                    (.eventLoopGroup (SdkEventLoopGroup/create netty-io-executor))
                     (.connectionAcquisitionTimeout default-timeout)
                     (.connectionTimeout default-timeout)
                     (.readTimeout default-timeout)
@@ -262,7 +259,7 @@
       (.close ^InputStream input))))

 (defn- make-request-body
-  [executor content]
+  [counter content]
   (let [size (impl/get-size content)]
     (reify
       AsyncRequestBody
@@ -272,16 +269,19 @@
       (^void subscribe [_ ^Subscriber subscriber]
         (let [delegate (AsyncRequestBody/forBlockingInputStream (long size))
               input    (io/input-stream content)]
-          (px/run! executor (partial write-input-stream delegate input))
+          (px/thread-call (partial write-input-stream delegate input)
+                          {:name (str "penpot/storage/" (.getAndIncrement ^AtomicLong counter))})
           (.subscribe ^BlockingInputStreamAsyncRequestBody delegate
                       ^Subscriber subscriber))))))

 (defn- put-object
-  [{:keys [::client ::bucket ::prefix ::wrk/executor]} {:keys [id] :as object} content]
+  [{:keys [::client ::bucket ::prefix ::counter]} {:keys [id] :as object} content]
   (let [path    (dm/str prefix (impl/id->path id))
         mdata   (meta object)
         mtype   (:content-type mdata "application/octet-stream")
-        rbody   (make-request-body executor content)
+        rbody   (make-request-body counter content)
         request (.. (PutObjectRequest/builder)
                     (bucket bucket)
                     (contentType mtype)

View File

@@ -44,7 +44,7 @@
   [_ cfg]
   (fs/create-dir default-tmp-dir)
   (px/fn->thread (partial io-loop cfg)
-                 {:name "penpot/storage/tmp-cleaner" :virtual true}))
+                 {:name "penpot/storage/tmp-cleaner"}))

 (defmethod ig/halt-key! ::cleaner
   [_ thread]

View File

@@ -10,6 +10,7 @@
    [app.common.logging :as l]
    [app.common.time :as ct]
    [app.db :as db]
+   [app.db.sql :as-alias sql]
    [app.rpc.commands.files :as files]
    [app.rpc.commands.profile :as profile]
    [integrant.core :as ig]))
@@ -19,10 +20,28 @@
 (defmulti delete-object
   (fn [_ props] (:object props)))

+(defmethod delete-object :snapshot
+  [{:keys [::db/conn] :as cfg} {:keys [id file-id deleted-at]}]
+  (l/trc :obj "snapshot" :id (str id) :file-id (str file-id)
+         :deleted-at (ct/format-inst deleted-at))
+  (db/update! conn :file-change
+              {:deleted-at deleted-at}
+              {:id id :file-id file-id}
+              {::db/return-keys false})
+  (db/update! conn :file-data
+              {:deleted-at deleted-at}
+              {:id id :file-id file-id :type "snapshot"}
+              {::db/return-keys false}))
+
 (defmethod delete-object :file
   [{:keys [::db/conn] :as cfg} {:keys [id deleted-at]}]
-  (when-let [file (db/get* conn :file {:id id} {::db/remove-deleted false})]
-    (l/trc :hint "marking for deletion" :rel "file" :id (str id)
+  (when-let [file (db/get* conn :file {:id id}
+                           {::db/remove-deleted false
+                            ::sql/columns [:id :is-shared]})]
+    (l/trc :obj "file" :id (str id)
           :deleted-at (ct/format-inst deleted-at))
     (db/update! conn :file
@@ -43,25 +62,35 @@
     ;; Mark file change to be deleted
     (db/update! conn :file-change
                 {:deleted-at deleted-at}
-                {:file-id id})
+                {:file-id id}
+                {::db/return-keys false})
+
+    ;; Mark file data fragment to be deleted
+    (db/update! conn :file-data
+                {:deleted-at deleted-at}
+                {:file-id id}
+                {::db/return-keys false})

     ;; Mark file media objects to be deleted
     (db/update! conn :file-media-object
                 {:deleted-at deleted-at}
-                {:file-id id})
+                {:file-id id}
+                {::db/return-keys false})

     ;; Mark thumbnails to be deleted
     (db/update! conn :file-thumbnail
                 {:deleted-at deleted-at}
-                {:file-id id})
+                {:file-id id}
+                {::db/return-keys false})
+
     (db/update! conn :file-tagged-object-thumbnail
                 {:deleted-at deleted-at}
-                {:file-id id})))
+                {:file-id id}
+                {::db/return-keys false})))

 (defmethod delete-object :project
   [{:keys [::db/conn] :as cfg} {:keys [id deleted-at]}]
-  (l/trc :hint "marking for deletion" :rel "project" :id (str id)
+  (l/trc :obj "project" :id (str id)
         :deleted-at (ct/format-inst deleted-at))
   (db/update! conn :project
@@ -78,7 +107,7 @@
 (defmethod delete-object :team
   [{:keys [::db/conn] :as cfg} {:keys [id deleted-at]}]
-  (l/trc :hint "marking for deletion" :rel "team" :id (str id)
+  (l/trc :obj "team" :id (str id)
         :deleted-at (ct/format-inst deleted-at))
   (db/update! conn :team
               {:deleted-at deleted-at}
@@ -100,7 +129,7 @@
 (defmethod delete-object :profile
   [{:keys [::db/conn] :as cfg} {:keys [id deleted-at]}]
-  (l/trc :hint "marking for deletion" :rel "profile" :id (str id)
+  (l/trc :obj "profile" :id (str id)
         :deleted-at (ct/format-inst deleted-at))
   (db/update! conn :profile
@@ -115,7 +144,7 @@
 (defmethod delete-object :default
   [_cfg props]
-  (l/wrn :hint "not implementation found" :rel (:object props)))
+  (l/wrn :obj (:object props) :hint "not implementation found"))

 (defmethod ig/assert-key ::handler
   [_ params]

View File

@@ -23,29 +23,16 @@
[app.config :as cf] [app.config :as cf]
[app.db :as db] [app.db :as db]
[app.features.fdata :as feat.fdata] [app.features.fdata :as feat.fdata]
[app.features.file-snapshots :as fsnap]
[app.storage :as sto] [app.storage :as sto]
[app.worker :as wrk] [app.worker :as wrk]
[integrant.core :as ig])) [integrant.core :as ig]))
(declare get-file) (declare get-file)
(def sql:get-snapshots
"SELECT fc.file_id AS id,
fc.id AS snapshot_id,
fc.data,
fc.revn,
fc.version,
fc.features,
fc.data_backend,
fc.data_ref_id
FROM file_change AS fc
WHERE fc.file_id = ?
AND fc.data IS NOT NULL
ORDER BY fc.created_at ASC")
(def ^:private sql:mark-file-media-object-deleted (def ^:private sql:mark-file-media-object-deleted
"UPDATE file_media_object "UPDATE file_media_object
SET deleted_at = now() SET deleted_at = ?
WHERE file_id = ? AND id != ALL(?::uuid[]) WHERE file_id = ? AND id != ALL(?::uuid[])
RETURNING id") RETURNING id")
@@ -56,37 +43,35 @@
(defn- clean-file-media! (defn- clean-file-media!
"Performs the garbage collection of file media objects." "Performs the garbage collection of file media objects."
[{:keys [::db/conn] :as cfg} {:keys [id] :as file}] [{:keys [::db/conn ::timestamp] :as cfg} {:keys [id] :as file}]
(let [xform (comp (let [used-media
(map (partial bfc/decode-file cfg)) (fsnap/reduce-snapshots cfg id xf:collect-used-media conj #{})
xf:collect-used-media)
used (->> (db/plan conn [sql:get-snapshots id] {:fetch-size 1}) used-media
(transduce xform conj #{})) (into used-media xf:collect-used-media [file])
used (into used xf:collect-used-media [file])
ids (db/create-array conn "uuid" used) used-media
unused (->> (db/exec! conn [sql:mark-file-media-object-deleted id ids]) (db/create-array conn "uuid" used-media)
unused-media
(->> (db/exec! conn [sql:mark-file-media-object-deleted timestamp id used-media])
(into #{} (map :id)))] (into #{} (map :id)))]
(l/dbg :hint "clean" :rel "file-media-object" :file-id (str id) :total (count unused)) (doseq [id unused-media]
(l/trc :obj "media-object"
(doseq [id unused] :file-id (str id)
(l/trc :hint "mark deleted" :id (str id)))
:rel "file-media-object"
:id (str id)
:file-id (str id)))
file)) file))
(def ^:private sql:mark-file-object-thumbnails-deleted (def ^:private sql:mark-file-object-thumbnails-deleted
"UPDATE file_tagged_object_thumbnail "UPDATE file_tagged_object_thumbnail
SET deleted_at = now() SET deleted_at = ?
WHERE file_id = ? AND object_id != ALL(?::text[]) WHERE file_id = ? AND object_id != ALL(?::text[])
RETURNING object_id") RETURNING object_id")
(defn- clean-file-object-thumbnails! (defn- clean-file-object-thumbnails!
[{:keys [::db/conn]} {:keys [data] :as file}] [{:keys [::db/conn ::timestamp]} {:keys [data] :as file}]
(let [file-id (:id file) (let [file-id (:id file)
using (->> (vals (:pages-index data)) using (->> (vals (:pages-index data))
(into #{} (comp (into #{} (comp
@@ -98,49 +83,37 @@
(thc/fmt-object-id file-id page-id id "frame") (thc/fmt-object-id file-id page-id id "frame")
(thc/fmt-object-id file-id page-id id "component"))))))) (thc/fmt-object-id file-id page-id id "component")))))))
ids (db/create-array conn "text" using) ids (into-array String using)
unused (->> (db/exec! conn [sql:mark-file-object-thumbnails-deleted file-id ids]) unused (->> (db/exec! conn [sql:mark-file-object-thumbnails-deleted timestamp file-id ids])
(into #{} (map :object-id)))] (into #{} (map :object-id)))]
(l/dbg :hint "clean" :rel "file-object-thumbnail" :file-id (str file-id) :total (count unused))
(doseq [object-id unused] (doseq [object-id unused]
(l/trc :hint "mark deleted" (l/trc :obj "object-thumbnail"
:rel "file-tagged-object-thumbnail" :file-id (str file-id)
:object-id object-id :id object-id))
:file-id (str file-id)))
file)) file))
(def ^:private sql:mark-file-thumbnails-deleted (def ^:private sql:mark-file-thumbnails-deleted
"UPDATE file_thumbnail "UPDATE file_thumbnail
SET deleted_at = now() SET deleted_at = ?
WHERE file_id = ? AND revn < ? WHERE file_id = ? AND revn < ?
RETURNING revn") RETURNING revn")
(defn- clean-file-thumbnails! (defn- clean-file-thumbnails!
[{:keys [::db/conn]} {:keys [id revn] :as file}] [{:keys [::db/conn ::timestamp]} {:keys [id revn] :as file}]
(let [unused (->> (db/exec! conn [sql:mark-file-thumbnails-deleted id revn]) (let [unused (->> (db/exec! conn [sql:mark-file-thumbnails-deleted timestamp id revn])
(into #{} (map :revn)))] (into #{} (map :revn)))]
(l/dbg :hint "clean" :rel "file-thumbnail" :file-id (str id) :total (count unused))
(doseq [revn unused] (doseq [revn unused]
(l/trc :hint "mark deleted" (l/trc :obj "thumbnail"
:rel "file-thumbnail" :file-id (str id)
:revn revn :revn revn))
:file-id (str id)))
file)) file))
(def ^:private sql:get-files-for-library (def ^:private sql:get-files-for-library
"SELECT f.id, "SELECT f.id
f.data,
f.modified_at,
f.features,
f.version,
f.data_backend,
f.data_ref_id
FROM file AS f FROM file AS f
LEFT JOIN file_library_rel AS fl ON (fl.file_id = f.id) LEFT JOIN file_library_rel AS fl ON (fl.file_id = f.id)
WHERE fl.library_file_id = ? WHERE fl.library_file_id = ?
@@ -161,15 +134,21 @@
deleted-components deleted-components
(ctkl/deleted-components-seq data) (ctkl/deleted-components-seq data)
xform file-xform
(mapcat (partial get-used-components deleted-components file-id)) (mapcat (partial get-used-components deleted-components file-id))
library-xform
(comp
(map :id)
(map #(bfc/get-file cfg % :realize? true :read-only? true))
file-xform)
used-remote used-remote
(->> (db/plan conn [sql:get-files-for-library file-id] {:fetch-size 1}) (->> (db/plan conn [sql:get-files-for-library file-id] {:fetch-size 1})
(transduce (comp (map (partial bfc/decode-file cfg)) xform) conj #{})) (transduce library-xform conj #{}))
used-local used-local
(into #{} xform [file]) (into #{} file-xform [file])
unused unused
(transduce bfc/xf-map-id disj (transduce bfc/xf-map-id disj
@@ -180,21 +159,21 @@
     (update file :data
             (fn [data]
               (reduce (fn [data id]
-                        (l/trc :hint "delete component"
-                               :component-id (str id)
-                               :file-id (str file-id))
+                        (l/trc :obj "component"
+                               :file-id (str file-id)
+                               :id (str id))
                         (ctkl/delete-component data id))
                       data
                       unused)))]
-    (l/dbg :hint "clean" :rel "components" :file-id (str file-id) :total (count unused))
     file))
 (def ^:private sql:mark-deleted-data-fragments
-  "UPDATE file_data_fragment
-     SET deleted_at = now()
+  "UPDATE file_data
+     SET deleted_at = ?
    WHERE file_id = ?
      AND id != ALL(?::uuid[])
+     AND type = 'fragment'
      AND deleted_at IS NULL
 RETURNING id")
@@ -203,19 +182,16 @@
         (mapcat feat.fdata/get-used-pointer-ids)))

 (defn- clean-fragments!
-  [{:keys [::db/conn]} {:keys [id] :as file}]
+  [{:keys [::db/conn ::timestamp]} {:keys [id] :as file}]
   (let [used   (into #{} xf:collect-pointers [file])
-        unused (->> (db/exec! conn [sql:mark-deleted-data-fragments id
+        unused (->> (db/exec! conn [sql:mark-deleted-data-fragments timestamp id
                                     (db/create-array conn "uuid" used)])
                     (into #{} bfc/xf-map-id))]
-    (l/dbg :hint "clean" :rel "file-data-fragment" :file-id (str id) :total (count unused))
     (doseq [id unused]
-      (l/trc :hint "mark deleted"
-             :rel "file-data-fragment"
-             :id (str id)
-             :file-id (str id)))
+      (l/trc :obj "fragment"
+             :file-id (str id)
+             :id (str id)))
     file))
@@ -229,36 +205,23 @@
   (cfv/validate-file-schema! file)
   file))

-(def ^:private sql:get-file
-  "SELECT f.id,
-          f.data,
-          f.revn,
-          f.version,
-          f.features,
-          f.modified_at,
-          f.data_backend,
-          f.data_ref_id
-    FROM file AS f
-   WHERE f.has_media_trimmed IS false
-     AND f.modified_at < now() - ?::interval
-     AND f.deleted_at IS NULL
-     AND f.id = ?
-     FOR UPDATE
-      SKIP LOCKED")
-
 (defn get-file
-  [{:keys [::db/conn ::min-age]} file-id]
-  (let [min-age (if min-age
-                  (db/interval min-age)
-                  (db/interval 0))]
-    (->> (db/exec! conn [sql:get-file min-age file-id])
-         (first))))
+  [cfg {:keys [file-id revn]}]
+  (let [file (bfc/get-file cfg file-id
+                           :realize? true
+                           :skip-locked? true
+                           :lock-for-update? true)]
+    ;; We should ensure that the scheduled file and the processing file
+    ;; has not changed since schedule; for this reason we check the
+    ;; revn from props against the revn of the file retrieved from database
+    (when (or (nil? revn) (= revn (:revn file)))
+      file)))
 (defn- process-file!
-  [cfg file-id]
-  (if-let [file (get-file cfg file-id)]
+  [cfg {:keys [file-id] :as props}]
+  (if-let [file (get-file cfg props)]
     (let [file (->> file
-                    (bfc/decode-file cfg)
                     (bfl/clean-file)
                     (clean-media! cfg)
                     (clean-fragments! cfg))
@@ -267,7 +230,7 @@
       true)
     (do
-      (l/dbg :hint "skip" :file-id (str file-id))
+      (l/dbg :hint "skip cleaning, criteria does not match" :file-id (str file-id))
       false)))

 ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
@@ -282,26 +245,23 @@
 (defmethod ig/init-key ::handler
   [_ cfg]
   (fn [{:keys [props] :as task}]
-    (let [min-age (ct/duration (or (:min-age props)
-                                   (cf/get-deletion-delay)))
-          file-id (get props :file-id)
-          cfg     (-> cfg
-                      (assoc ::db/rollback (:rollback? props))
-                      (assoc ::min-age min-age))]
-      (try
-        (db/tx-run! cfg (fn [{:keys [::db/conn] :as cfg}]
-                          (let [cfg        (update cfg ::sto/storage sto/configure conn)
-                                processed? (process-file! cfg file-id)]
-                            (when (and processed? (contains? cf/flags :tiered-file-data-storage))
-                              (wrk/submit! (-> cfg
-                                               (assoc ::wrk/task :offload-file-data)
-                                               (assoc ::wrk/params props)
-                                               (assoc ::wrk/priority 10)
-                                               (assoc ::wrk/delay 1000))))
-                            processed?)))
-        (catch Throwable cause
-          (l/err :hint "error on cleaning file"
-                 :file-id (str (:file-id props))
-                 :cause cause))))))
+    (try
+      (-> cfg
+          (assoc ::db/rollback (:rollback? props))
+          (db/tx-run! (fn [{:keys [::db/conn] :as cfg}]
+                        (let [cfg        (-> cfg
+                                             (update ::sto/storage sto/configure conn)
+                                             (assoc ::timestamp (ct/now)))
+                              processed? (process-file! cfg props)]
+                          (when (and processed? (contains? cf/flags :tiered-file-data-storage))
+                            (wrk/submit! (-> cfg
+                                             (assoc ::wrk/task :offload-file-data)
+                                             (assoc ::wrk/params props)
+                                             (assoc ::wrk/priority 10)
+                                             (assoc ::wrk/delay 1000))))
+                          processed?))))
+      (catch Throwable cause
+        (l/err :hint "error on cleaning file"
+               :file-id (str (:file-id props))
+               :cause cause)))))

View File

@@ -17,29 +17,29 @@
 (def ^:private
   sql:get-candidates
   "SELECT f.id,
+          f.revn,
           f.modified_at
     FROM file AS f
    WHERE f.has_media_trimmed IS false
      AND f.modified_at < now() - ?::interval
      AND f.deleted_at IS NULL
    ORDER BY f.modified_at DESC
-     FOR UPDATE
+     FOR UPDATE OF f
      SKIP LOCKED")

 (defn- get-candidates
   [{:keys [::db/conn ::min-age] :as cfg}]
   (let [min-age (db/interval min-age)]
-    (db/cursor conn [sql:get-candidates min-age] {:chunk-size 10})))
+    (db/plan conn [sql:get-candidates min-age] {:fetch-size 10})))

 (defn- schedule!
-  [{:keys [::min-age] :as cfg}]
-  (let [total (reduce (fn [total {:keys [id]}]
-                        (let [params {:file-id id :min-age min-age}]
+  [cfg]
+  (let [total (reduce (fn [total {:keys [id modified-at revn]}]
+                        (let [params {:file-id id :modified-at modified-at :revn revn}]
                           (wrk/submit! (assoc cfg ::wrk/params params))
                           (inc total)))
                       0
                       (get-candidates cfg))]
     {:processed total}))
 (defmethod ig/assert-key ::handler
@@ -48,7 +48,7 @@
 (defmethod ig/expand-key ::handler
   [k v]
-  {k (assoc v ::min-age (cf/get-deletion-delay))})
+  {k (assoc v ::min-age (cf/get-file-clean-delay))})

 (defmethod ig/init-key ::handler
   [_ cfg]

View File

@@ -11,6 +11,7 @@
    [app.common.logging :as l]
    [app.common.time :as ct]
    [app.db :as db]
+   [app.features.fdata :as fdata]
    [app.storage :as sto]
    [integrant.core :as ig]))
@@ -27,14 +28,14 @@
   [{:keys [::db/conn ::deletion-threshold ::chunk-size ::sto/storage] :as cfg}]
   (->> (db/plan conn [sql:get-profiles deletion-threshold chunk-size] {:fetch-size 5})
        (reduce (fn [total {:keys [id photo-id]}]
-                 (l/trc :hint "permanently delete" :rel "profile" :id (str id))
+                 (l/trc :obj "profile" :id (str id))

                  ;; Mark as deleted the storage object
                  (some->> photo-id (sto/touch-object! storage))

-                 (db/delete! conn :profile {:id id})
-
-                 (inc total))
+                 (let [affected (-> (db/delete! conn :profile {:id id})
+                                    (db/get-update-count))]
+                   (+ total affected)))
                0)))

 (def ^:private sql:get-teams
@@ -50,8 +51,7 @@
   [{:keys [::db/conn ::deletion-threshold ::chunk-size ::sto/storage] :as cfg}]
   (->> (db/plan conn [sql:get-teams deletion-threshold chunk-size] {:fetch-size 5})
        (reduce (fn [total {:keys [id photo-id deleted-at]}]
-                 (l/trc :hint "permanently delete"
-                        :rel "team"
+                 (l/trc :obj "team"
                         :id (str id)
                         :deleted-at (ct/format-inst deleted-at))
@@ -59,9 +59,9 @@
                  (some->> photo-id (sto/touch-object! storage))

                  ;; And finally, permanently delete the team.
-                 (db/delete! conn :team {:id id})
-
-                 (inc total))
+                 (let [affected (-> (db/delete! conn :team {:id id})
+                                    (db/get-update-count))]
+                   (+ total affected)))
                0)))

 (def ^:private sql:get-fonts
@@ -78,8 +78,7 @@
   [{:keys [::db/conn ::deletion-threshold ::chunk-size ::sto/storage] :as cfg}]
   (->> (db/plan conn [sql:get-fonts deletion-threshold chunk-size] {:fetch-size 5})
        (reduce (fn [total {:keys [id team-id deleted-at] :as font}]
-                 (l/trc :hint "permanently delete"
-                        :rel "team-font-variant"
+                 (l/trc :obj "font-variant"
                         :id (str id)
                         :team-id (str team-id)
                         :deleted-at (ct/format-inst deleted-at))
@@ -90,10 +89,9 @@
                  (some->> (:otf-file-id font) (sto/touch-object! storage))
                  (some->> (:ttf-file-id font) (sto/touch-object! storage))

-                 ;; And finally, permanently delete the team font variant
-                 (db/delete! conn :team-font-variant {:id id})
-
-                 (inc total))
+                 (let [affected (-> (db/delete! conn :team-font-variant {:id id})
+                                    (db/get-update-count))]
+                   (+ total affected)))
                0)))

 (def ^:private sql:get-projects
@@ -110,45 +108,40 @@
   [{:keys [::db/conn ::deletion-threshold ::chunk-size] :as cfg}]
   (->> (db/plan conn [sql:get-projects deletion-threshold chunk-size] {:fetch-size 5})
        (reduce (fn [total {:keys [id team-id deleted-at]}]
-                 (l/trc :hint "permanently delete"
-                        :rel "project"
+                 (l/trc :obj "project"
                         :id (str id)
                         :team-id (str team-id)
                         :deleted-at (ct/format-inst deleted-at))

-                 ;; And finally, permanently delete the project.
-                 (db/delete! conn :project {:id id})
-
-                 (inc total))
+                 (let [affected (-> (db/delete! conn :project {:id id})
+                                    (db/get-update-count))]
+                   (+ total affected)))
                0)))
 (def ^:private sql:get-files
-  "SELECT id, deleted_at, project_id, data_backend, data_ref_id
-    FROM file
-   WHERE deleted_at IS NOT NULL
-     AND deleted_at < now() + ?::interval
-   ORDER BY deleted_at ASC
+  "SELECT f.id,
+          f.deleted_at,
+          f.project_id
+    FROM file AS f
+   WHERE f.deleted_at IS NOT NULL
+     AND f.deleted_at < now() + ?::interval
+   ORDER BY f.deleted_at ASC
    LIMIT ?
      FOR UPDATE
      SKIP LOCKED")
 (defn- delete-files!
-  [{:keys [::db/conn ::sto/storage ::deletion-threshold ::chunk-size] :as cfg}]
+  [{:keys [::db/conn ::deletion-threshold ::chunk-size] :as cfg}]
   (->> (db/plan conn [sql:get-files deletion-threshold chunk-size] {:fetch-size 5})
        (reduce (fn [total {:keys [id deleted-at project-id] :as file}]
-                 (l/trc :hint "permanently delete"
-                        :rel "file"
+                 (l/trc :obj "file"
                         :id (str id)
                         :project-id (str project-id)
                         :deleted-at (ct/format-inst deleted-at))

-                 (when (= "objects-storage" (:data-backend file))
-                   (sto/touch-object! storage (:data-ref-id file)))
-
-                 ;; And finally, permanently delete the file.
-                 (db/delete! conn :file {:id id})
-
-                 (inc total))
+                 (let [affected (-> (db/delete! conn :file {:id id})
+                                    (db/get-update-count))]
+                   (+ total affected)))
                0)))

 (def ^:private sql:get-file-thumbnails
@@ -165,8 +158,7 @@
   [{:keys [::db/conn ::deletion-threshold ::chunk-size ::sto/storage] :as cfg}]
   (->> (db/plan conn [sql:get-file-thumbnails deletion-threshold chunk-size] {:fetch-size 5})
        (reduce (fn [total {:keys [file-id revn media-id deleted-at]}]
-                 (l/trc :hint "permanently delete"
-                        :rel "file-thumbnail"
+                 (l/trc :obj "file-thumbnail"
                         :file-id (str file-id)
                         :revn revn
                         :deleted-at (ct/format-inst deleted-at))
@@ -174,10 +166,9 @@
                  ;; Mark as deleted the storage object
                  (some->> media-id (sto/touch-object! storage))

-                 ;; And finally, permanently delete the object
-                 (db/delete! conn :file-thumbnail {:file-id file-id :revn revn})
-
-                 (inc total))
+                 (let [affected (-> (db/delete! conn :file-thumbnail {:file-id file-id :revn revn})
+                                    (db/get-update-count))]
+                   (+ total affected)))
                0)))

 (def ^:private sql:get-file-object-thumbnails
@@ -194,8 +185,7 @@
   [{:keys [::db/conn ::deletion-threshold ::chunk-size ::sto/storage] :as cfg}]
   (->> (db/plan conn [sql:get-file-object-thumbnails deletion-threshold chunk-size] {:fetch-size 5})
        (reduce (fn [total {:keys [file-id object-id media-id deleted-at]}]
-                 (l/trc :hint "permanently delete"
-                        :rel "file-tagged-object-thumbnail"
+                 (l/trc :obj "file-object-thumbnail"
                         :file-id (str file-id)
                         :object-id object-id
                         :deleted-at (ct/format-inst deleted-at))
@@ -203,36 +193,10 @@
                  ;; Mark as deleted the storage object
                  (some->> media-id (sto/touch-object! storage))

-                 ;; And finally, permanently delete the object
-                 (db/delete! conn :file-tagged-object-thumbnail {:file-id file-id :object-id object-id})
-
-                 (inc total))
+                 (let [affected (-> (db/delete! conn :file-tagged-object-thumbnail
+                                                {:file-id file-id :object-id object-id})
+                                    (db/get-update-count))]
+                   (+ total affected)))
                0)))

-(def ^:private sql:get-file-data-fragments
-  "SELECT file_id, id, deleted_at, data_ref_id
-    FROM file_data_fragment
-   WHERE deleted_at IS NOT NULL
-     AND deleted_at < now() + ?::interval
-   ORDER BY deleted_at ASC
-   LIMIT ?
-     FOR UPDATE
-      SKIP LOCKED")
-
-(defn- delete-file-data-fragments!
-  [{:keys [::db/conn ::sto/storage ::deletion-threshold ::chunk-size] :as cfg}]
-  (->> (db/plan conn [sql:get-file-data-fragments deletion-threshold chunk-size] {:fetch-size 5})
-       (reduce (fn [total {:keys [file-id id deleted-at data-ref-id]}]
-                 (l/trc :hint "permanently delete"
-                        :rel "file-data-fragment"
-                        :id (str id)
-                        :file-id (str file-id)
-                        :deleted-at (ct/format-inst deleted-at))
-
-                 (some->> data-ref-id (sto/touch-object! storage))
-
-                 (db/delete! conn :file-data-fragment {:file-id file-id :id id})
-
-                 (inc total))
-               0)))

 (def ^:private sql:get-file-media-objects
@@ -249,8 +213,7 @@
   [{:keys [::db/conn ::deletion-threshold ::chunk-size ::sto/storage] :as cfg}]
   (->> (db/plan conn [sql:get-file-media-objects deletion-threshold chunk-size] {:fetch-size 5})
        (reduce (fn [total {:keys [id file-id deleted-at] :as fmo}]
-                 (l/trc :hint "permanently delete"
-                        :rel "file-media-object"
+                 (l/trc :obj "file-media-object"
                         :id (str id)
                         :file-id (str file-id)
                         :deleted-at (ct/format-inst deleted-at))
@@ -259,13 +222,48 @@
                  (some->> (:media-id fmo) (sto/touch-object! storage))
                  (some->> (:thumbnail-id fmo) (sto/touch-object! storage))

-                 (db/delete! conn :file-media-object {:id id})
-
-                 (inc total))
+                 (let [affected (-> (db/delete! conn :file-media-object {:id id})
+                                    (db/get-update-count))]
+                   (+ total affected)))
                0)))

+(def ^:private sql:get-file-data
+  "SELECT file_id, id, type, deleted_at, metadata, backend
+    FROM file_data
+   WHERE deleted_at IS NOT NULL
+     AND deleted_at < now() + ?::interval
+   ORDER BY deleted_at ASC
+   LIMIT ?
+     FOR UPDATE
+      SKIP LOCKED")
+
+(defn- delete-file-data!
+  [{:keys [::db/conn ::deletion-threshold ::chunk-size] :as cfg}]
+  (->> (db/plan conn [sql:get-file-data deletion-threshold chunk-size] {:fetch-size 5})
+       (reduce (fn [total {:keys [file-id id type deleted-at metadata backend]}]
+                 (some->> metadata
+                          (fdata/decode-metadata)
+                          (fdata/process-metadata cfg))
+
+                 (l/trc :obj "file-data"
+                        :id (str id)
+                        :file-id (str file-id)
+                        :type type
+                        :backend backend
+                        :deleted-at (ct/format-inst deleted-at))
+
+                 (let [affected (-> (db/delete! conn :file-data
+                                                {:file-id file-id
+                                                 :id id
+                                                 :type type})
+                                    (db/get-update-count))]
+                   (+ total affected)))
+               0)))
 (def ^:private sql:get-file-change
-  "SELECT id, file_id, deleted_at, data_backend, data_ref_id
+  "SELECT id, file_id, deleted_at
     FROM file_change
    WHERE deleted_at IS NOT NULL
      AND deleted_at < now() + ?::interval
@@ -275,29 +273,25 @@
      SKIP LOCKED")

 (defn- delete-file-changes!
-  [{:keys [::db/conn ::deletion-threshold ::chunk-size ::sto/storage] :as cfg}]
+  [{:keys [::db/conn ::deletion-threshold ::chunk-size] :as cfg}]
   (->> (db/plan conn [sql:get-file-change deletion-threshold chunk-size] {:fetch-size 5})
        (reduce (fn [total {:keys [id file-id deleted-at] :as xlog}]
-                 (l/trc :hint "permanently delete"
-                        :rel "file-change"
+                 (l/trc :obj "file-change"
                         :id (str id)
                         :file-id (str file-id)
                         :deleted-at (ct/format-inst deleted-at))

-                 (when (= "objects-storage" (:data-backend xlog))
-                   (sto/touch-object! storage (:data-ref-id xlog)))
-
-                 (db/delete! conn :file-change {:id id})
-
-                 (inc total))
+                 (let [affected (-> (db/delete! conn :file-change {:id id})
+                                    (db/get-update-count))]
+                   (+ total affected)))
                0)))

 (def ^:private deletion-proc-vars
   [#'delete-profiles!
    #'delete-file-media-objects!
-   #'delete-file-data-fragments!
    #'delete-file-object-thumbnails!
    #'delete-file-thumbnails!
+   #'delete-file-data!
    #'delete-file-changes!
    #'delete-files!
    #'delete-projects!
@@ -309,7 +303,8 @@
   until 0 results is returned"
   [cfg proc-fn]
   (loop [total 0]
-    (let [result (db/tx-run! cfg (fn [{:keys [::db/conn] :as cfg}]
+    (let [result (db/tx-run! cfg
+                             (fn [{:keys [::db/conn] :as cfg}]
                                (db/exec-one! conn ["SET LOCAL rules.deletion_protection TO off"])
                                (proc-fn cfg)))]
       (if (pos? result)
@@ -336,6 +331,4 @@
         (let [result (execute-proc! cfg proc-fn)]
           (recur (rest procs)
                  (long (+ total result))))
-        (do
-          (l/inf :hint "task finished" :deleted total)
-          {:processed total}))))))
+        {:processed total})))))

View File

@@ -8,101 +8,25 @@
   "A maintenance task responsible of moving file data from hot
   storage (the database row) to a cold storage (fs or s3)."
   (:require
-   [app.common.exceptions :as ex]
    [app.common.logging :as l]
    [app.db :as db]
-   [app.db.sql :as-alias sql]
+   [app.features.fdata :as fdata]
    [app.storage :as sto]
    [integrant.core :as ig]))

-(defn- offload-file-data!
-  [{:keys [::db/conn ::sto/storage ::file-id] :as cfg}]
-  (let [file (db/get conn :file {:id file-id}
-                     {::sql/for-update true})]
-    (when (nil? (:data file))
-      (ex/raise :hint "file already offloaded"
-                :type :internal
-                :code :file-already-offloaded
-                :file-id file-id))
-
-    (let [data (sto/content (:data file))
-          sobj (sto/put-object! storage
-                                {::sto/content data
-                                 ::sto/touch true
-                                 :bucket "file-data"
-                                 :content-type "application/octet-stream"
-                                 :file-id file-id})]
-      (l/trc :hint "offload file data"
-             :file-id (str file-id)
-             :storage-id (str (:id sobj)))
-
-      (db/update! conn :file
-                  {:data-backend "objects-storage"
-                   :data-ref-id (:id sobj)
-                   :data nil}
-                  {:id file-id}
-                  {::db/return-keys false}))))
-
-(defn- offload-file-data-fragments!
-  [{:keys [::db/conn ::sto/storage ::file-id] :as cfg}]
-  (doseq [fragment (db/query conn :file-data-fragment
-                             {:file-id file-id
-                              :deleted-at nil
-                              :data-backend nil}
-                             {::db/for-update true})]
-    (let [data (sto/content (:data fragment))
-          sobj (sto/put-object! storage
-                                {::sto/content data
-                                 ::sto/touch true
-                                 :bucket "file-data-fragment"
-                                 :content-type "application/octet-stream"
-                                 :file-id file-id
-                                 :file-fragment-id (:id fragment)})]
-      (l/trc :hint "offload file data fragment"
-             :file-id (str file-id)
-             :file-fragment-id (str (:id fragment))
-             :storage-id (str (:id sobj)))
-
-      (db/update! conn :file-data-fragment
-                  {:data-backend "objects-storage"
-                   :data-ref-id (:id sobj)
-                   :data nil}
-                  {:id (:id fragment)}
-                  {::db/return-keys false}))))
-
-(def sql:get-snapshots
-  "SELECT fc.*
-     FROM file_change AS fc
-    WHERE fc.file_id = ?
-      AND fc.label IS NOT NULL
-      AND fc.data IS NOT NULL
-      AND fc.data_backend IS NULL")
-
-(defn- offload-file-snapshots!
-  [{:keys [::db/conn ::sto/storage ::file-id] :as cfg}]
-  (doseq [snapshot (db/exec! conn [sql:get-snapshots file-id])]
-    (let [data (sto/content (:data snapshot))
-          sobj (sto/put-object! storage
-                                {::sto/content data
-                                 ::sto/touch true
-                                 :bucket "file-change"
-                                 :content-type "application/octet-stream"
-                                 :file-id file-id
-                                 :file-change-id (:id snapshot)})]
-      (l/trc :hint "offload file change"
-             :file-id (str file-id)
-             :file-change-id (str (:id snapshot))
-             :storage-id (str (:id sobj)))
-
-      (db/update! conn :file-change
-                  {:data-backend "objects-storage"
-                   :data-ref-id (:id sobj)
-                   :data nil}
-                  {:id (:id snapshot)}
-                  {::db/return-keys false}))))
+(def ^:private sql:get-file-data
+  "SELECT fd.*
+     FROM file_data AS fd
+    WHERE fd.file_id = ?
+      AND fd.backend = 'db'
+      AND fd.deleted_at IS NULL")
+
+(defn- offload-file-data
+  [cfg {:keys [id file-id type] :as fdata}]
+  (fdata/upsert! cfg (assoc fdata :backend "storage"))
+  (l/trc :file-id (str file-id)
+         :id (str id)
+         :type type))

 ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
 ;; HANDLER
@@ -116,10 +40,9 @@
 (defmethod ig/init-key ::handler
   [_ cfg]
   (fn [{:keys [props] :as task}]
-    (-> cfg
-        (assoc ::db/rollback (:rollback? props))
-        (assoc ::file-id (:file-id props))
-        (db/tx-run! (fn [cfg]
-                      (offload-file-data! cfg)
-                      (offload-file-data-fragments! cfg)
-                      (offload-file-snapshots! cfg))))))
+    (let [file-id (:file-id props)]
+      (-> cfg
+          (assoc ::db/rollback (:rollback? props))
+          (db/tx-run! (fn [{:keys [::db/conn] :as cfg}]
+                        (run! (partial offload-file-data cfg)
+                              (db/plan conn [sql:get-file-data file-id]))))))))

View File

@@ -8,33 +8,40 @@
   "Tokens generation API."
   (:require
    [app.common.data :as d]
-   [app.common.data.macros :as dm]
    [app.common.exceptions :as ex]
    [app.common.time :as ct]
    [app.common.transit :as t]
+   [app.setup :as-alias setup]
    [buddy.sign.jwe :as jwe]))

 (defn generate
-  [{:keys [tokens-key]} claims]
-  (dm/assert!
-   "expexted token-key to be bytes instance"
-   (bytes? tokens-key))
-
-  (let [payload (-> claims
-                    (assoc :iat (ct/now))
-                    (d/without-nils)
-                    (t/encode))]
+  [{:keys [::setup/props] :as cfg} claims]
+  (assert (contains? cfg ::setup/props))
+  (let [tokens-key
+        (get props :tokens-key)
+
+        payload
+        (-> claims
+            (update :iat (fn [v] (or v (ct/now))))
+            (d/without-nils)
+            (t/encode))]
     (jwe/encrypt payload tokens-key {:alg :a256kw :enc :a256gcm})))

 (defn decode
-  [{:keys [tokens-key]} token]
-  (let [payload (jwe/decrypt token tokens-key {:alg :a256kw :enc :a256gcm})]
+  [{:keys [::setup/props] :as cfg} token]
+  (let [tokens-key
+        (get props :tokens-key)
+
+        payload
+        (jwe/decrypt token tokens-key {:alg :a256kw :enc :a256gcm})]
     (t/decode payload)))

 (defn verify
-  [sprops {:keys [token] :as params}]
-  (let [claims (decode sprops token)]
+  [cfg {:keys [token] :as params}]
+  (let [claims (decode cfg token)]
     (when (and (ct/inst? (:exp claims))
                (ct/is-before? (:exp claims) (ct/now)))
       (ex/raise :type :validation

View File

@@ -27,7 +27,7 @@
     (sp/put! channel [type data])
     nil)))

-(defn start-listener
+(defn spawn-listener
   [channel on-event on-close]
   (assert (sp/chan? channel) "expected active events channel")
@@ -51,7 +51,7 @@
   [f on-event]
   (binding [*channel* (sp/chan :buf 32)]
-    (let [listener (start-listener *channel* on-event (constantly nil))]
+    (let [listener (spawn-listener *channel* on-event (constantly nil))]
       (try
         (f)
         (finally

View File

@@ -7,7 +7,6 @@
 (ns app.worker.dispatcher
   (:require
    [app.common.data :as d]
-   [app.common.data.macros :as dm]
    [app.common.logging :as l]
    [app.common.schema :as sm]
    [app.common.time :as ct]
@@ -18,7 +17,9 @@
    [app.worker :as-alias wrk]
    [cuerdas.core :as str]
    [integrant.core :as ig]
-   [promesa.exec :as px]))
+   [promesa.exec :as px])
+  (:import
+   java.lang.AutoCloseable))

 (set! *warn-on-reflection* true)
@@ -27,7 +28,7 @@
    [::wrk/tenant ::sm/text]
    ::mtx/metrics
    ::db/pool
-   ::rds/redis])
+   ::rds/client])

 (defmethod ig/expand-key ::wrk/dispatcher
   [k v]
@@ -41,42 +42,107 @@
   (assert (sm/check schema:dispatcher cfg)))

 (def ^:private sql:select-next-tasks
-  "select id, queue from task as t
-    where t.scheduled_at <= now()
-      and (t.status = 'new' or t.status = 'retry')
-      and queue ~~* ?::text
-    order by t.priority desc, t.scheduled_at
-    limit ?
-      for update skip locked")
+  "SELECT id, queue, scheduled_at from task AS t
+    WHERE t.scheduled_at <= ?::timestamptz
+      AND (t.status = 'new' OR t.status = 'retry')
+      AND queue ~~* ?::text
+    ORDER BY t.priority DESC, t.scheduled_at
+    LIMIT ?
+      FOR UPDATE
+       SKIP LOCKED")
+
+(def ^:private sql:mark-task-scheduled
+  "UPDATE task SET status = 'scheduled'
+    WHERE id = ANY(?)")
+
+(def ^:private sql:reschedule-lost
+  "UPDATE task
+      SET status='new', scheduled_at=?::timestamptz
+     FROM (SELECT t.id
+             FROM task AS t
+            WHERE status = 'scheduled'
+              AND (?::timestamptz - t.scheduled_at) > '5 min'::interval) AS subquery
+    WHERE task.id=subquery.id
+ RETURNING task.id, task.queue")
+
+(def ^:private sql:clean-orphan
+  "UPDATE task
+      SET status='failed', modified_at=?::timestamptz,
+          error='orphan with running status'
+     FROM (SELECT t.id
+             FROM task AS t
+            WHERE status = 'running'
+              AND (?::timestamptz - t.modified_at) > '24 hour'::interval) AS subquery
+    WHERE task.id=subquery.id
+ RETURNING task.id, task.queue")
 (defmethod ig/init-key ::wrk/dispatcher
-  [_ {:keys [::db/pool ::rds/redis ::wrk/tenant ::batch-size ::timeout] :as cfg}]
-  (letfn [(get-tasks [conn]
-            (let [prefix (str tenant ":%")]
-              (seq (db/exec! conn [sql:select-next-tasks prefix batch-size]))))
-
-          (push-tasks! [conn rconn [queue tasks]]
-            (let [ids (mapv :id tasks)
-                  key (str/ffmt "taskq:%" queue)
-                  res (rds/rpush rconn key (mapv t/encode ids))
-                  sql [(str "update task set status = 'scheduled'"
-                            " where id = ANY(?)")
-                       (db/create-array conn "uuid" ids)]]
-              (db/exec-one! conn sql)
-              (l/trc :hist "enqueue tasks on redis"
-                     :queue queue
-                     :tasks (count ids)
-                     :queued res)))
-
-          (run-batch! [rconn]
-            (try
-              (db/tx-run! cfg (fn [{:keys [::db/conn]}]
-                                (if-let [tasks (get-tasks conn)]
-                                  (->> (group-by :queue tasks)
-                                       (run! (partial push-tasks! conn rconn)))
-                                  ;; FIXME: this sleep should be outside the transaction
-                                  (px/sleep (::wait-duration cfg)))))
+  [_ {:keys [::db/pool ::wrk/tenant ::batch-size ::timeout] :as cfg}]
+  (letfn [(reschedule-lost-tasks [{:keys [::db/conn ::timestamp]}]
+            (doseq [{:keys [id queue]} (db/exec! conn [sql:reschedule-lost timestamp timestamp]
+                                                 {:return-keys true})]
+              (l/wrn :hint "reschedule"
+                     :id (str id)
+                     :queue queue)))
+
+          (clean-orphan [{:keys [::db/conn ::timestamp]}]
+            (doseq [{:keys [id queue]} (db/exec! conn [sql:clean-orphan timestamp timestamp]
+                                                 {:return-keys true})]
+              (l/wrn :hint "mark as orphan failed"
+                     :id (str id)
+                     :queue queue)))
+
+          (get-tasks [{:keys [::db/conn ::timestamp] :as cfg}]
+            (let [prefix (str tenant ":%")
+                  result (db/exec! conn [sql:select-next-tasks timestamp prefix batch-size])]
+              (not-empty result)))
+
+          (mark-as-scheduled [{:keys [::db/conn]} items]
+            (let [ids (map :id items)
+                  sql [sql:mark-task-scheduled
+                       (db/create-array conn "uuid" ids)]]
+              (db/exec-one! conn sql)))
+
+          (push-tasks [{:keys [::rds/conn] :as cfg} [queue tasks]]
+            (let [items (mapv (juxt :id :scheduled-at) tasks)
+                  key   (str/ffmt "penpot.worker.queue:%" queue)]
+              (rds/rpush conn key (mapv t/encode-str items))
+              (mark-as-scheduled cfg tasks)
+              (doseq [{:keys [id queue]} tasks]
+                (l/trc :hist "schedule"
+                       :id (str id)
+                       :queue queue))))
+
+          (run-batch' [cfg]
+            (let [cfg (assoc cfg ::timestamp (ct/now))]
+              ;; Reschedule lost in transit tasks (can happen when
+              ;; redis server is restarted just after task is pushed)
+              (reschedule-lost-tasks cfg)
+
+              ;; Mark as failed all tasks that are still marked as
+              ;; running but it's been more than 24 hours since its
+              ;; last modification
+              (clean-orphan cfg)
+
+              ;; Then, schedule the next tasks in queue
+              (if-let [tasks (get-tasks cfg)]
+                (->> (group-by :queue tasks)
+                     (run! (partial push-tasks cfg)))
+
+                ;; If no tasks found on this batch run, we signal the
+                ;; run-loop to wait for some time before start running
+                ;; the next batch iteration
+                ::wait)))
+
+          (run-batch []
+            (let [rconn (rds/connect cfg)]
+              (try
+                (-> cfg
+                    (assoc ::rds/conn rconn)
+                    (db/tx-run! run-batch'))
               (catch InterruptedException cause
                 (throw cause))
               (catch Exception cause
@@ -94,14 +160,18 @@
                 :else
                 (do
                   (l/err :hint "unhandled exception (will retry in an instant)" :cause cause)
-                  (px/sleep timeout))))))
+                  (px/sleep timeout))))
+              (finally
+                (.close ^AutoCloseable rconn)))))

           (dispatcher []
             (l/inf :hint "started")
             (try
-              (dm/with-open [rconn (rds/connect redis)]
-                (loop []
-                  (run-batch! rconn)
-                  (recur)))
+              (loop []
+                (let [result (run-batch)]
+                  (when (= result ::wait)
+                    (px/sleep (::wait-duration cfg)))
+                  (recur)))
               (catch InterruptedException _
                 (l/trc :hint "interrupted"))
@@ -112,7 +182,7 @@
(if (db/read-only? pool) (if (db/read-only? pool)
(l/wrn :hint "not started (db is read-only)") (l/wrn :hint "not started (db is read-only)")
(px/fn->thread dispatcher :name "penpot/worker/dispatcher" :virtual false)))) (px/fn->thread dispatcher :name "penpot/worker-dispatcher"))))
(defmethod ig/halt-key! ::wrk/dispatcher (defmethod ig/halt-key! ::wrk/dispatcher
[_ thread] [_ thread]
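The dispatcher change above moves the idle sleep out of the database transaction: a batch now returns a `::wait` sentinel when no tasks were found, and only the outer run loop sleeps. A minimal, language-agnostic sketch of that control flow (illustrative only, not Penpot code; names are hypothetical):

```python
import time

WAIT = object()  # sentinel, analogous to ::wait in the Clojure code


def run_dispatcher(run_batch, wait_seconds, max_iters=None):
    """Run batches back to back; sleep only when a batch signals WAIT.

    `max_iters` exists only so the sketch can terminate; the real loop
    runs until the thread is interrupted.
    """
    i = 0
    while max_iters is None or i < max_iters:
        result = run_batch()
        if result is WAIT:
            # No pending tasks: wait outside any transaction.
            time.sleep(wait_seconds)
        i += 1
```

The point of the design is that a long `sleep` no longer holds a DB transaction open, which the old code's `FIXME` comment called out explicitly.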
View File
@@ -7,97 +7,83 @@
 (ns app.worker.executor
   "Async tasks abstraction (impl)."
   (:require
-   [app.common.data :as d]
    [app.common.logging :as l]
+   [app.common.math :as mth]
    [app.common.schema :as sm]
-   [app.common.time :as ct]
-   [app.metrics :as mtx]
    [app.worker :as-alias wrk]
    [integrant.core :as ig]
    [promesa.exec :as px])
   (:import
-   java.util.concurrent.ThreadPoolExecutor))
+   io.netty.channel.nio.NioEventLoopGroup
+   io.netty.util.concurrent.DefaultEventExecutorGroup
+   java.util.concurrent.ExecutorService
+   java.util.concurrent.ThreadFactory
+   java.util.concurrent.TimeUnit))

 (set! *warn-on-reflection* true)
 (sm/register!
  {:type ::wrk/executor
-  :pred #(instance? ThreadPoolExecutor %)
+  :pred #(instance? ExecutorService %)
   :type-properties
   {:title "executor"
-   :description "Instance of ThreadPoolExecutor"}})
+   :description "Instance of ExecutorService"}})

+(sm/register!
+ {:type ::wrk/netty-io-executor
+  :pred #(instance? NioEventLoopGroup %)
+  :type-properties
+  {:title "executor"
+   :description "Instance of NioEventLoopGroup"}})

+(sm/register!
+ {:type ::wrk/netty-executor
+  :pred #(instance? DefaultEventExecutorGroup %)
+  :type-properties
+  {:title "executor"
+   :description "Instance of DefaultEventExecutorGroup"}})

 ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
-;; EXECUTOR
+;; IO Executor
 ;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;

-(defmethod ig/init-key ::wrk/executor
-  [_ _]
-  (let [factory  (px/thread-factory :prefix "penpot/default/")
-        executor (px/cached-executor :factory factory :keepalive 60000)]
-    (l/inf :hint "executor started")
-    executor))
+(defmethod ig/assert-key ::wrk/netty-io-executor
+  [_ {:keys [threads]}]
+  (assert (or (nil? threads) (int? threads))
+          "expected valid threads value, revisit PENPOT_NETTY_IO_THREADS environment variable"))

-(defmethod ig/halt-key! ::wrk/executor
+(defmethod ig/init-key ::wrk/netty-io-executor
+  [_ {:keys [threads]}]
+  (let [factory  (px/thread-factory :prefix "penpot/netty-io/")
+        nthreads (or threads (mth/round (/ (px/get-available-processors) 2)))
+        nthreads (max 2 nthreads)]
+    (l/inf :hint "start netty io executor" :threads nthreads)
+    (NioEventLoopGroup. (int nthreads) ^ThreadFactory factory)))
+
+(defmethod ig/halt-key! ::wrk/netty-io-executor
+  [_ instance]
+  (deref (.shutdownGracefully ^NioEventLoopGroup instance
+                              (long 100)
+                              (long 1000)
+                              TimeUnit/MILLISECONDS)))
+
+;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
+;; IO Offload Executor
+;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
+
+(defmethod ig/assert-key ::wrk/netty-executor
+  [_ {:keys [threads]}]
+  (assert (or (nil? threads) (int? threads))
+          "expected valid threads value, revisit PENPOT_EXEC_THREADS environment variable"))
+
+(defmethod ig/init-key ::wrk/netty-executor
+  [_ {:keys [threads]}]
+  (let [factory  (px/thread-factory :prefix "penpot/exec/")
+        nthreads (or threads (mth/round (/ (px/get-available-processors) 2)))
+        nthreads (max 2 nthreads)]
+    (l/inf :hint "start default executor" :threads nthreads)
+    (DefaultEventExecutorGroup. (int nthreads) ^ThreadFactory factory)))
+
+(defmethod ig/halt-key! ::wrk/netty-executor
   [_ instance]
   (px/shutdown! instance))
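Both new executors size their pools the same way: an explicit thread count from configuration if present, otherwise half the available processors, with a floor of two. A hedged sketch of that sizing rule (illustrative; the function name is hypothetical):

```python
import os


def executor_threads(configured=None):
    """Mirror the `(max 2 (or threads (round (/ cpus 2))))` sizing rule
    used by the netty-io and netty executors above."""
    n = configured if configured is not None else round((os.cpu_count() or 1) / 2)
    # Never run with fewer than two threads, even on single-core hosts
    # or when a too-small value is configured.
    return max(2, n)
```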
-;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
-;; MONITOR
-;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
-
-(defn- get-stats
-  [^ThreadPoolExecutor executor]
-  {:active    (.getPoolSize ^ThreadPoolExecutor executor)
-   :running   (.getActiveCount ^ThreadPoolExecutor executor)
-   :completed (.getCompletedTaskCount ^ThreadPoolExecutor executor)})
-
-(defmethod ig/expand-key ::wrk/monitor
-  [k v]
-  {k (-> (d/without-nils v)
-         (assoc ::interval (ct/duration "2s")))})
-
-(defmethod ig/init-key ::wrk/monitor
-  [_ {:keys [::wrk/executor ::mtx/metrics ::interval ::wrk/name]}]
-  (letfn [(monitor! [executor prev-completed]
-            (let [labels        (into-array String [(d/name name)])
-                  stats         (get-stats executor)
-                  completed     (:completed stats)
-                  completed-inc (- completed prev-completed)
-                  completed-inc (if (neg? completed-inc) 0 completed-inc)]
-              (mtx/run! metrics
-                        :id :executor-active-threads
-                        :labels labels
-                        :val (:active stats))
-              (mtx/run! metrics
-                        :id :executor-running-threads
-                        :labels labels
-                        :val (:running stats))
-              (mtx/run! metrics
-                        :id :executors-completed-tasks
-                        :labels labels
-                        :inc completed-inc)
-              completed-inc))]
-    (px/thread
-      {:name "penpot/executors-monitor" :virtual true}
-      (l/inf :hint "monitor started" :name name)
-      (try
-        (loop [completed 0]
-          (px/sleep interval)
-          (recur (long (monitor! executor completed))))
-        (catch InterruptedException _cause
-          (l/trc :hint "monitor: interrupted" :name name))
-        (catch Throwable cause
-          (l/err :hint "monitor: unexpected error" :name name :cause cause))
-        (finally
-          (l/inf :hint "monitor: terminated" :name name))))))
-
-(defmethod ig/halt-key! ::wrk/monitor
-  [_ thread]
-  (px/interrupt! thread))
View File
@@ -8,7 +8,6 @@
   "Async tasks abstraction (impl)."
   (:require
    [app.common.data :as d]
-   [app.common.data.macros :as dm]
    [app.common.exceptions :as ex]
    [app.common.logging :as l]
    [app.common.schema :as sm]
@@ -20,7 +19,9 @@
    [app.worker :as wrk]
    [cuerdas.core :as str]
    [integrant.core :as ig]
-   [promesa.exec :as px]))
+   [promesa.exec :as px])
+  (:import
+   java.lang.AutoCloseable))

 (set! *warn-on-reflection* true)
@@ -37,7 +38,7 @@
    [:max-retries :int]
    [:retry-num :int]
    [:priority :int]
-   [:status [:enum "scheduled" "completed" "new" "retry" "failed"]]
+   [:status [:enum "scheduled" "running" "completed" "new" "retry" "failed"]]
    [:label {:optional true} :string]
    [:props :map]])
@@ -68,7 +69,7 @@
(decode-task-row)))) (decode-task-row))))
(defn- run-task (defn- run-task
[{:keys [::wrk/registry ::id ::queue] :as cfg} task] [{:keys [::db/pool ::wrk/registry ::id ::queue] :as cfg} task]
(try (try
(l/dbg :hint "start" (l/dbg :hint "start"
:name (:name task) :name (:name task)
@@ -76,6 +77,14 @@
:queue queue :queue queue
:runner-id id :runner-id id
:retry (:retry-num task)) :retry (:retry-num task))
;; Mark task as running
(db/update! pool :task
{:status "running"
:modified-at (ct/now)}
{:id (:id task)}
{::db/return-keys false})
(let [tpoint (ct/tpoint) (let [tpoint (ct/tpoint)
task-fn (wrk/get-task registry (:name task)) task-fn (wrk/get-task registry (:name task))
result (when task-fn (task-fn task)) result (when task-fn (task-fn task))
@@ -119,7 +128,7 @@
                  {:status "retry" :error cause})))))))

 (defn- run-task!
-  [{:keys [::id ::timeout] :as cfg} task-id]
+  [{:keys [::id ::timeout] :as cfg} task-id scheduled-at]
   (loop [task (get-task cfg task-id)]
     (cond
       (ex/exception? task)
@@ -127,20 +136,26 @@
           (db/serialization-error? task))
         (do
           (l/wrn :hint "connection error on retrieving task from database (retrying in some instants)"
-                 :id id
+                 :runner-id id
                  :cause task)
          (px/sleep timeout)
          (recur (get-task cfg task-id)))
        (do
          (l/err :hint "unhandled exception on retrieving task from database (retrying in some instants)"
-                :id id
+                :runner-id id
                 :cause task)
          (px/sleep timeout)
          (recur (get-task cfg task-id))))

+      (not= (inst-ms scheduled-at)
+            (inst-ms (:scheduled-at task)))
+      (l/wrn :hint "skiping task, rescheduled"
+             :task-id task-id
+             :runner-id id)
+
       (nil? task)
       (l/wrn :hint "no task found on the database"
-             :id id
+             :runner-id id
              :task-id task-id)

       :else
 (defn- run-worker-loop!
-  [{:keys [::db/pool ::rds/rconn ::timeout ::queue] :as cfg}]
+  [{:keys [::db/pool ::rds/conn ::timeout ::queue] :as cfg}]
   (letfn [(handle-task-retry [{:keys [error inc-by delay] :or {inc-by 1 delay 1000} :as result}]
             (let [explain (if (ex/exception? error)
                             (ex-message error)
@@ -183,21 +198,23 @@
           (db/update! pool :task
                       {:completed-at now
                        :modified-at now
+                       :error nil
                        :status "completed"}
                       {:id (:id task)})
           nil))

-        (decode-payload [^bytes payload]
+        (decode-payload [payload]
           (try
-            (let [task-id (t/decode payload)]
-              (if (uuid? task-id)
-                task-id
-                (l/err :hint "received unexpected payload (uuid expected)"
-                       :payload task-id)))
+            (let [[task-id scheduled-at :as payload] (t/decode-str payload)]
+              (if (and (uuid? task-id)
+                       (ct/inst? scheduled-at))
+                payload
+                (l/err :hint "received unexpected payload"
+                       :payload payload)))
            (catch Throwable cause
              (l/err :hint "unable to decode payload"
                     :payload payload
-                    :length (alength payload)
+                    :length (alength ^String/1 payload)
                     :cause cause))))

        (process-result [{:keys [status] :as result}]
@@ -209,8 +226,8 @@
             (throw (IllegalArgumentException.
                     (str "invalid status received: " status))))))

-        (run-task-loop [task-id]
-          (loop [result (run-task! cfg task-id)]
+        (run-task-loop [[task-id scheduled-at]]
+          (loop [result (run-task! cfg task-id scheduled-at)]
            (when-let [cause (process-result result)]
              (if (or (db/connection-error? cause)
                      (db/serialization-error? cause))
@@ -220,14 +237,12 @@
                  (px/sleep timeout)
                  (recur result))
                (do
-                  (l/err :hint "unhandled exception on processing task result (retrying in some instants)"
-                         :cause cause)
-                  (px/sleep timeout)
-                  (recur result))))))]
+                  (l/err :hint "unhandled exception on processing task result"
+                         :cause cause))))))]

    (try
-      (let [key (str/ffmt "taskq:%" queue)
-            [_ payload] (rds/blpop rconn timeout [key])]
+      (let [key (str/ffmt "penpot.worker.queue:%" queue)
+            [_ payload] (rds/blpop conn [key] timeout)]
        (some-> payload
                decode-payload
                run-task-loop))
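The payload format change above means each Redis queue entry now carries a `[task-id, scheduled-at]` pair instead of a bare UUID, so a worker can detect that a task was rescheduled after its entry was pushed and skip the stale entry. A hedged sketch of that idea (the real code uses transit encoding; JSON and these function names are illustrative stand-ins):

```python
import json
import uuid
from datetime import datetime


def decode_payload(raw):
    """Return (task_id, scheduled_at) or None when the payload is malformed."""
    try:
        task_id, scheduled_at = json.loads(raw)
        return uuid.UUID(task_id), datetime.fromisoformat(scheduled_at)
    except (ValueError, TypeError):
        return None


def should_run(payload_scheduled_at, db_scheduled_at):
    # A mismatch means the task was rescheduled after this queue entry
    # was pushed; the stale entry is skipped, matching the new
    # "skiping task, rescheduled" branch in run-task! above.
    return payload_scheduled_at == db_scheduled_at
```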
@@ -246,22 +261,22 @@
           (l/err :hint "unhandled exception" :cause cause))))))

 (defn- start-thread!
-  [{:keys [::rds/redis ::id ::queue ::wrk/tenant] :as cfg}]
+  [{:keys [::id ::queue ::wrk/tenant] :as cfg}]
   (px/thread
-    {:name (format "penpot/worker/runner:%s" id)}
+    {:name (str "penpot/job-runner/" id)}
     (l/inf :hint "started" :id id :queue queue)
+    (let [rconn (rds/connect cfg)]
       (try
-        (dm/with-open [rconn (rds/connect redis)]
-          (let [cfg (-> cfg
-                        (assoc ::rds/rconn rconn)
+        (loop [cfg (-> cfg
+                       (assoc ::rds/conn rconn)
                        (assoc ::queue (str/ffmt "%:%" tenant queue))
                        (assoc ::timeout (ct/duration "5s")))]
-            (loop []
          (when (px/interrupted?)
            (throw (InterruptedException. "interrupted")))
          (run-worker-loop! cfg)
-              (recur))))
+          (recur cfg))
        (catch InterruptedException _
          (l/dbg :hint "interrupted"
@@ -273,9 +288,10 @@
                 :queue queue
                 :cause cause))
        (finally
+          (.close ^AutoCloseable rconn)
          (l/inf :hint "terminated"
                 :id id
-                 :queue queue)))))
+                 :queue queue))))))
 (def ^:private schema:params
   [:map
@@ -285,7 +301,7 @@
    ::wrk/registry
    ::mtx/metrics
    ::db/pool
-   ::rds/redis])
+   ::rds/client])

 (defmethod ig/assert-key ::wrk/runner
   [_ params]
@@ -303,7 +319,7 @@
     (l/wrn :hint "not started (db is read-only)" :queue queue :parallelism parallelism)
     (doall
      (->> (range parallelism)
-          (map #(assoc cfg ::id %))
+          (map #(assoc cfg ::id (str queue "/" %)))
           (map start-thread!))))))

 (defmethod ig/halt-key! ::wrk/runner
View File
@@ -4,7 +4,7 @@
  penpot/path-data app.common.types.path/from-string
  penpot/matrix app.common.geom.matrix/decode-matrix
  penpot/point app.common.geom.point/decode-point
- penpot/token-lib app.common.types.tokens-lib/parse-multi-set-dtcg-json
+ penpot/tokens-lib app.common.types.tokens-lib/parse-multi-set-dtcg-json
  penpot/token-set app.common.types.tokens-lib/make-token-set
  penpot/token-theme app.common.types.tokens-lib/make-token-theme
  penpot/token app.common.types.tokens-lib/make-token}
View File
@@ -101,12 +101,10 @@
 (t/deftest test-parse-bounce-report
   (let [profile (th/create-profile* 1)
-        props   (:app.setup/props th/*system*)
-        cfg     {:app.setup/props props}
-        report  (bounce-report {:token (tokens/generate props
+        report  (bounce-report {:token (tokens/generate th/*system*
                                                         {:iss :profile-identity
                                                          :profile-id (:id profile)})})
-        result  (#'awsns/parse-notification cfg report)]
+        result  (#'awsns/parse-notification th/*system* report)]
     ;; (pprint result)
     (t/is (= "bounce" (:type result)))

@@ -117,12 +115,10 @@
 (t/deftest test-parse-complaint-report
   (let [profile (th/create-profile* 1)
-        props   (:app.setup/props th/*system*)
-        cfg     {:app.setup/props props}
-        report  (complaint-report {:token (tokens/generate props
+        report  (complaint-report {:token (tokens/generate th/*system*
                                                            {:iss :profile-identity
                                                             :profile-id (:id profile)})})
-        result  (#'awsns/parse-notification cfg report)]
+        result  (#'awsns/parse-notification th/*system* report)]
     ;; (pprint result)
     (t/is (= "complaint" (:type result)))
     (t/is (= "abuse" (:kind result)))

@@ -143,15 +139,13 @@
 (t/deftest test-process-bounce-report
   (let [profile (th/create-profile* 1)
-        props   (:app.setup/props th/*system*)
         pool    (:app.db/pool th/*system*)
-        cfg     {:app.setup/props props :app.db/pool pool}
-        report  (bounce-report {:token (tokens/generate props
+        report  (bounce-report {:token (tokens/generate th/*system*
                                                         {:iss :profile-identity
                                                          :profile-id (:id profile)})})
-        report  (#'awsns/parse-notification cfg report)]
-    (#'awsns/process-report cfg report)
+        report  (#'awsns/parse-notification th/*system* report)]
+    (#'awsns/process-report th/*system* report)

     (let [rows (->> (db/query pool :profile-complaint-report {:profile-id (:id profile)})
                     (mapv decode-row))]

@@ -170,16 +164,13 @@
 (t/deftest test-process-complaint-report
   (let [profile (th/create-profile* 1)
-        props   (:app.setup/props th/*system*)
         pool    (:app.db/pool th/*system*)
-        cfg     {:app.setup/props props
-                 :app.db/pool pool}
-        report  (complaint-report {:token (tokens/generate props
+        report  (complaint-report {:token (tokens/generate th/*system*
                                                            {:iss :profile-identity
                                                             :profile-id (:id profile)})})
-        report  (#'awsns/parse-notification cfg report)]
-    (#'awsns/process-report cfg report)
+        report  (#'awsns/parse-notification th/*system* report)]
+    (#'awsns/process-report th/*system* report)

     (let [rows (->> (db/query pool :profile-complaint-report {:profile-id (:id profile)})
                     (mapv decode-row))]

@@ -200,16 +191,14 @@
 (t/deftest test-process-bounce-report-to-self
   (let [profile (th/create-profile* 1)
-        props   (:app.setup/props th/*system*)
         pool    (:app.db/pool th/*system*)
-        cfg     {:app.setup/props props :app.db/pool pool}
         report  (bounce-report {:email (:email profile)
-                                :token (tokens/generate props
+                                :token (tokens/generate th/*system*
                                                         {:iss :profile-identity
                                                          :profile-id (:id profile)})})
-        report  (#'awsns/parse-notification cfg report)]
-    (#'awsns/process-report cfg report)
+        report  (#'awsns/parse-notification th/*system* report)]
+    (#'awsns/process-report th/*system* report)

     (let [rows (db/query pool :profile-complaint-report {:profile-id (:id profile)})]
       (t/is (= 1 (count rows))))

@@ -222,16 +211,14 @@
 (t/deftest test-process-complaint-report-to-self
   (let [profile (th/create-profile* 1)
-        props   (:app.setup/props th/*system*)
         pool    (:app.db/pool th/*system*)
-        cfg     {:app.setup/props props :app.db/pool pool}
         report  (complaint-report {:email (:email profile)
-                                   :token (tokens/generate props
+                                   :token (tokens/generate th/*system*
                                                            {:iss :profile-identity
                                                             :profile-id (:id profile)})})
-        report  (#'awsns/parse-notification cfg report)]
-    (#'awsns/process-report cfg report)
+        report  (#'awsns/parse-notification th/*system* report)]
+    (#'awsns/process-report th/*system* report)

     (let [rows (db/query pool :profile-complaint-report {:profile-id (:id profile)})]
       (t/is (= 1 (count rows))))
View File
@@ -62,7 +62,8 @@
 (def default
   {:database-uri "postgresql://postgres/penpot_test"
    :redis-uri "redis://redis/1"
-   :auto-file-snapshot-every 1})
+   :auto-file-snapshot-every 1
+   :file-data-backend "db"})

 (def config
   (cf/read-config :prefix "penpot-test"
@@ -74,9 +75,6 @@
     :enable-smtp
     :enable-quotes
     :enable-rpc-climit
-    :enable-feature-fdata-pointer-map
-    :enable-feature-fdata-objets-map
-    :enable-feature-components-v2
     :enable-auto-file-snapshot
     :disable-file-validation])
@@ -99,7 +97,7 @@
                   :thumbnail-uri "test"
                   :path (-> "backend_tests/test_files/template.penpot" io/resource fs/path)}]
     system (-> (merge main/system-config main/worker-config)
-               (assoc-in [:app.redis/redis :app.redis/uri] (:redis-uri config))
+               (assoc-in [:app.redis/client :app.redis/uri] (:redis-uri config))
                (assoc-in [::db/pool ::db/uri] (:database-uri config))
                (assoc-in [::db/pool ::db/username] (:database-username config))
                (assoc-in [::db/pool ::db/password] (:database-password config))
@@ -113,7 +111,6 @@
                :app.auth.oidc.providers/generic
                :app.setup/templates
                :app.auth.oidc/routes
-               :app.worker/monitor
                :app.http.oauth/handler
                :app.notifications/handler
                :app.loggers.mattermost/reporter
View File
@@ -25,8 +25,7 @@
 (t/deftest authenticate-method
   (let [profile  (th/create-profile* 1)
-        props    (get th/*system* :app.setup/props)
-        token    (#'sess/gen-token props {:profile-id (:id profile)})
+        token    (#'sess/gen-token th/*system* {:profile-id (:id profile)})
         request  {:params {:token token}}
         response (#'mgmt/authenticate th/*system* request)]
View File
@@ -0,0 +1,59 @@
;; This Source Code Form is subject to the terms of the Mozilla Public
;; License, v. 2.0. If a copy of the MPL was not distributed with this
;; file, You can obtain one at http://mozilla.org/MPL/2.0/.
;;
;; Copyright (c) KALEIDOS INC
(ns backend-tests.http-middleware-security
(:require
[app.http.security :as sec]
[clojure.test :as t]
[yetti.request :as yreq]
[yetti.response :as yres]))
(defn- mock-request
[method value]
(reify yreq/IRequest
(method [_]
method)
(get-header [_ _]
value)))
(t/deftest sec-fetch-metadata
(let [request1 (mock-request :get "same-origin")
request2 (mock-request :post "same-origin")
request3 (mock-request :get "same-site")
request4 (mock-request :post "same-site")
request5 (mock-request :get "cross-site")
request6 (mock-request :post "cross-site")
handler (fn [request]
{::yres/status 200})
handler (#'sec/wrap-sec-fetch-metadata handler)
resp1 (handler request1)
resp2 (handler request2)
resp3 (handler request3)
resp4 (handler request4)
resp5 (handler request5)
resp6 (handler request6)]
(t/is (= 200 (::yres/status resp1)))
(t/is (= 200 (::yres/status resp2)))
(t/is (= 200 (::yres/status resp3)))
(t/is (= 403 (::yres/status resp4)))
(t/is (= 200 (::yres/status resp5)))
(t/is (= 403 (::yres/status resp6)))))
(t/deftest client-header-check
(let [request1 (mock-request :get "some")
request2 (mock-request :post nil)
handler (fn [request]
{::yres/status 200})
handler (#'sec/wrap-client-header-check handler)
resp1 (handler request1)
resp2 (handler request2)]
(t/is (= 200 (::yres/status resp1)))
(t/is (= 403 (::yres/status resp2)))))
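The `sec-fetch-metadata` test above encodes a policy: GET requests pass regardless of `Sec-Fetch-Site`, while state-changing methods are rejected when the request is not same-origin. A minimal sketch of a check consistent with those assertions (assumed behavior, not the actual `app.http.security` implementation; the treatment of a missing header and the `"none"` value are illustrative guesses):

```python
def allow_request(method, sec_fetch_site):
    """Return True when the request should pass the Sec-Fetch check."""
    if method == "GET":
        # Safe method: allowed from any site, matching resp1/resp3/resp5.
        return True
    # Non-safe methods: only same-origin (resp2 passes, resp4/resp6 are 403).
    # Absent header and "none" (direct navigation) are assumed allowed.
    return sec_fetch_site in (None, "same-origin", "none")
```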
View File
@@ -144,7 +144,6 @@
     (t/is (not= (:modified-at comment) (:modified-at comment')))
     (t/is (= (:content data) (:content comment'))))))
-
   (t/testing "retrieve threads"
     (let [data {::th/type :get-comment-threads
                 ::rpc/profile-id (:id profile-1)
View File
@@ -29,7 +29,7 @@
      true
      (catch Throwable _cause
        false)))
-   {:num 30}))
+   {:num 15}))
View File
@@ -8,10 +8,10 @@
   (:require
    [app.common.features :as cfeat]
    [app.common.pprint :as pp]
-   [app.common.pprint :as pp]
    [app.common.thumbnails :as thc]
    [app.common.types.shape :as cts]
    [app.common.uuid :as uuid]
+   [app.config :as cf]
    [app.db :as db]
    [app.db.sql :as sql]
    [app.http :as http]
@@ -87,10 +87,7 @@
       ;; (th/print-result! out)
       (t/is (nil? (:error out)))
-      (let [result (:result out)]
-        (t/is (= "label1" (:label result)))
-        (t/is (uuid? (:id result)))))
+      (t/is (nil? (:result out))))

     (let [[row1 row2 :as rows]
           (th/db-query :file-change
@@ -116,7 +113,7 @@
       ;; (th/print-result! out)
       (t/is (nil? (:error out)))
-      (t/is (nil? (:result out)))))
+      (t/is (true? (:result out)))))

     (t/testing "delete system created snapshot"
       (let [params {::th/type :delete-file-snapshot
@@ -130,7 +127,14 @@
             data  (ex-data error)]
         (t/is (th/ex-info? error))
         (t/is (= (:type data) :validation))
-        (t/is (= (:code data) :system-snapshots-cant-be-deleted)))))))))
+        (t/is (= (:code data) :system-snapshots-cant-be-deleted)))))
+
+    ;; this will run pending task triggered by deleting user snapshot
+    (th/run-pending-tasks!)
+
+    (let [res (th/run-task! :objects-gc {:deletion-threshold (cf/get-deletion-delay)})]
+      ;; delete 2 snapshots and 2 file data entries
+      (t/is (= 4 (:processed res))))))))
 (t/deftest snapshots-locking
   (let [profile-1 (th/create-profile* 1 {:is-active true})
@@ -172,9 +176,9 @@
          out    (th/command! params)]
      ;; (th/print-result! out)
      (t/is (nil? (:error out)))
-      (t/is (nil? (:result out)))
+      (t/is (true? (:result out)))
-      (let [snapshot (th/db-get :file-change {:id (:id snapshot)})]
+      (let [snapshot (th/db-get :file-change {:id (:id snapshot)} {::db/remove-deleted false})]
        (t/is (= (:id profile-1) (:locked-by snapshot))))))

   (t/testing "delete locked snapshot"
@@ -199,7 +203,7 @@
          out    (th/command! params)]
      ;; (th/print-result! out)
      (t/is (nil? (:error out)))
-      (t/is (nil? (:result out)))
+      (t/is (true? (:result out)))
      (let [snapshot (th/db-get :file-change {:id (:id snapshot)})]
        (t/is (= nil (:locked-by snapshot))))))
@@ -213,4 +217,4 @@
      ;; (th/print-result! out)
      (t/is (nil? (:error out)))
-      (t/is (nil? (:result out)))))))
+      (t/is (true? (:result out)))))))
View File
@@ -15,6 +15,7 @@
    [app.config :as cf]
    [app.db :as db]
    [app.db.sql :as sql]
+   [app.features.fdata :as fdata]
    [app.http :as http]
    [app.rpc :as-alias rpc]
    [app.rpc.commands.files :as files]
@@ -116,29 +117,8 @@
       (t/is (nil? (:error out)))
       (t/is (nil? (:result out)))))

-    (t/testing "query single file after delete"
-      (let [data {::th/type :get-file
-                  ::rpc/profile-id (:id prof)
-                  :id file-id
-                  :components-v2 true}
-            out  (th/command! data)]
-        ;; (th/print-result! out)
-        (t/is (nil? (:error out)))
-        (let [result (:result out)]
-          (t/is (some? (:deleted-at result)))
-          (t/is (= file-id (:id result)))
-          (t/is (= "new name" (:name result)))
-          (t/is (= 1 (count (get-in result [:data :pages]))))
-          (t/is (nil? (:users result))))))
-
-    (th/db-update! :file
-                   {:deleted-at (ct/now)}
-                   {:id file-id})
-
-    (t/testing "query single file after delete and wait"
+    (t/testing "query single file after delete"
       (let [data {::th/type :get-file
                   ::rpc/profile-id (:id prof)
                   :id file-id
@@ -185,10 +165,10 @@
shape-id (uuid/random)] shape-id (uuid/random)]
;; Preventive file-gc ;; Preventive file-gc
(t/is (true? (th/run-task! :file-gc {:min-age 0 :file-id (:id file)}))) (t/is (true? (th/run-task! :file-gc {:file-id (:id file) :revn (:revn file)})))
;; Check the number of fragments before adding the page ;; Check the number of fragments before adding the page
(let [rows (th/db-query :file-data-fragment {:file-id (:id file)})] (let [rows (th/db-query :file-data {:file-id (:id file) :type "fragment"})]
(t/is (= 2 (count rows)))) (t/is (= 2 (count rows))))
;; Add page ;; Add page
@@ -203,22 +183,23 @@
:id page-id}]) :id page-id}])
;; Check the number of fragments before adding the page ;; Check the number of fragments before adding the page
(let [rows (th/db-query :file-data-fragment {:file-id (:id file)})] (let [rows (th/db-query :file-data {:file-id (:id file) :type "fragment"})]
(t/is (= 3 (count rows)))) (t/is (= 3 (count rows))))
;; The file-gc should mark for remove unused fragments ;; The file-gc should mark for remove unused fragments
-    (t/is (true? (th/run-task! :file-gc {:min-age 0 :file-id (:id file)})))
+    (t/is (true? (th/run-task! :file-gc {:file-id (:id file)})))

     ;; Check the number of fragments
-    (let [rows (th/db-query :file-data-fragment {:file-id (:id file)})]
-      (t/is (= 5 (count rows))))
+    (let [rows (th/db-query :file-data {:file-id (:id file) :type "fragment"})]
+      (t/is (= 5 (count rows)))
+      (t/is (= 3 (count (filterv :deleted-at rows)))))

     ;; The objects-gc should remove unused fragments
     (let [res (th/run-task! :objects-gc {})]
       (t/is (= 3 (:processed res))))

     ;; Check the number of fragments
-    (let [rows (th/db-query :file-data-fragment {:file-id (:id file)})]
+    (let [rows (th/db-query :file-data {:file-id (:id file) :type "fragment"})]
       (t/is (= 2 (count rows))))

     ;; Add shape to page that should add a new fragment
@@ -242,44 +223,47 @@
                       :type :rect})}])

     ;; Check the number of fragments
-    (let [rows (th/db-query :file-data-fragment {:file-id (:id file)})]
+    (let [rows (th/db-query :file-data {:file-id (:id file) :type "fragment"})]
       (t/is (= 3 (count rows))))

     ;; The file-gc should mark for remove unused fragments
-    (t/is (true? (th/run-task! :file-gc {:min-age 0 :file-id (:id file)})))
+    (t/is (true? (th/run-task! :file-gc {:file-id (:id file)})))

     ;; The objects-gc should remove unused fragments
     (let [res (th/run-task! :objects-gc {})]
       (t/is (= 3 (:processed res))))

     ;; Check the number of fragments;
-    (let [rows (th/db-query :file-data-fragment {:file-id (:id file)
-                                                 :deleted-at nil})]
+    (let [rows (th/db-query :file-data {:file-id (:id file)
+                                        :type "fragment"
+                                        :deleted-at nil})]
       (t/is (= 2 (count rows))))

     ;; Lets proceed to delete all changes
     (th/db-delete! :file-change {:file-id (:id file)})
+    (th/db-delete! :file-data {:file-id (:id file) :type "snapshot"})
     (th/db-update! :file
                    {:has-media-trimmed false}
                    {:id (:id file)})

     ;; The file-gc should remove fragments related to changes
     ;; snapshots previously deleted.
-    (t/is (true? (th/run-task! :file-gc {:min-age 0 :file-id (:id file)})))
+    (t/is (true? (th/run-task! :file-gc {:file-id (:id file)})))

     ;; Check the number of fragments;
-    (let [rows (th/db-query :file-data-fragment {:file-id (:id file)})]
+    (let [rows (th/db-query :file-data {:file-id (:id file) :type "fragment"})]
       ;; (pp/pprint rows)
       (t/is (= 4 (count rows)))
-      (t/is (= 2 (count (remove (comp some? :deleted-at) rows)))))
+      (t/is (= 2 (count (remove :deleted-at rows)))))

     (let [res (th/run-task! :objects-gc {})]
       (t/is (= 2 (:processed res))))

-    (let [rows (th/db-query :file-data-fragment {:file-id (:id file)})]
+    (let [rows (th/db-query :file-data {:file-id (:id file) :type "fragment"})]
       (t/is (= 2 (count rows)))))))

-(t/deftest file-gc-task-with-thumbnails
+(t/deftest file-gc-with-thumbnails
   (letfn [(add-file-media-object [& {:keys [profile-id file-id]}]
             (let [mfile {:filename "sample.jpg"
                          :path (th/tempfile "backend_tests/test_files/sample.jpg")
@@ -347,7 +331,7 @@
                      :fills [{:fill-opacity 1
                               :fill-image {:id (:id fmo1) :width 100 :height 100 :mtype "image/jpeg"}}]})}])

-    ;; Check that reference storage objects on filemediaobjects
+    ;; Check that reference storage objects on file_media_objects
     ;; are the same because of deduplication feature.
     (t/is (= (:media-id fmo1) (:media-id fmo2)))
     (t/is (= (:thumbnail-id fmo1) (:thumbnail-id fmo2)))
@@ -360,32 +344,33 @@
       (t/is (= 2 (:freeze res)))
       (t/is (= 0 (:delete res))))

-    ;; run the file-gc task immediately without forced min-age
-    (t/is (false? (th/run-task! :file-gc {:file-id (:id file)})))
-
     ;; run the task again
-    (t/is (true? (th/run-task! :file-gc {:min-age 0 :file-id (:id file)})))
+    (t/is (true? (th/run-task! :file-gc {:file-id (:id file)})))

     ;; retrieve file and check trimmed attribute
     (let [row (th/db-get :file {:id (:id file)})]
       (t/is (true? (:has-media-trimmed row))))

     ;; check file media objects
-    (let [rows (th/db-query :file-media-object {:file-id (:id file)})]
-      (t/is (= 2 (count rows)))
-      (t/is (= 1 (count (remove (comp some? :deleted-at) rows)))))
+    (let [[row1 row2 :as rows]
+          (th/db-query :file-media-object
+                       {:file-id (:id file)}
+                       {:order-by [:created-at]})]
+      (t/is (= (:id fmo1) (:id row1)))
+      (t/is (= (:id fmo2) (:id row2)))
+      (t/is (ct/inst? (:deleted-at row2))))

     (let [res (th/run-task! :objects-gc {})]
+      ;; delete 2 fragments and 1 media object
       (t/is (= 3 (:processed res))))

     ;; check file media objects
     (let [rows (th/db-query :file-media-object {:file-id (:id file)})]
       (t/is (= 1 (count rows)))
-      (t/is (= 1 (count (remove (comp some? :deleted-at) rows)))))
+      (t/is (= 1 (count (remove :deleted-at rows)))))

     ;; The underlying storage objects are still available.
-    (t/is (some? (sto/get-object storage (:media-id fmo2))))
-    (t/is (some? (sto/get-object storage (:thumbnail-id fmo2))))
     (t/is (some? (sto/get-object storage (:media-id fmo1))))
     (t/is (some? (sto/get-object storage (:thumbnail-id fmo1))))
@@ -402,34 +387,40 @@
     ;; Now, we have deleted the usage of pointers to the
     ;; file-media-objects, if we paste file-gc, they should be marked
     ;; as deleted.
-    (t/is (true? (th/run-task! :file-gc {:min-age 0 :file-id (:id file)})))
+    (t/is (true? (th/run-task! :file-gc {:file-id (:id file)})))

     ;; This only clears fragments, the file media objects still referenced because
     ;; snapshots are preserved
     (let [res (th/run-task! :objects-gc {})]
       (t/is (= 2 (:processed res))))

-    ;; Mark all snapshots to be a non-snapshot file change
-    (th/db-exec! ["update file_change set data = null where file_id = ?" (:id file)])
-    (th/db-exec! ["update file_change set deleted_at = now() where file_id = ? and label is not null" (:id file)])
+    ;; Delete all snapshots
+    (th/db-exec! ["update file_data set deleted_at = now() where file_id = ? and type = 'snapshot'" (:id file)])
     (th/db-exec! ["update file set has_media_trimmed = false where id = ?" (:id file)])

+    (let [res (th/run-task! :objects-gc {:deletion-threshold 0})]
+      ;; this will remove the file change and file data entries for two snapshots
+      (t/is (= 4 (:processed res))))
+
     ;; Rerun the file-gc and objects-gc
-    (t/is (true? (th/run-task! :file-gc {:min-age 0 :file-id (:id file)})))
+    (t/is (true? (th/run-task! :file-gc {:file-id (:id file)})))

-    (let [res (th/run-task! :objects-gc {})]
+    (let [res (th/run-task! :objects-gc {:deletion-threshold 0})]
+      ;; this will remove the file media objects marked as deleted
+      ;; on prev file-gc
       (t/is (= 2 (:processed res))))

     ;; Now that file-gc have deleted the file-media-object usage,
     ;; lets execute the touched-gc task, we should see that two of
-    ;; them are marked to be deleted.
+    ;; them are marked to be deleted
     (let [res (th/run-task! :storage-gc-touched {:min-age 0})]
       (t/is (= 0 (:freeze res)))
       (t/is (= 2 (:delete res))))

     ;; Finally, check that some of the objects that are marked as
     ;; deleted we are unable to retrieve them using standard storage
-    ;; public api.
-    (t/is (nil? (sto/get-object storage (:media-id fmo2))))
-    (t/is (nil? (sto/get-object storage (:thumbnail-id fmo2))))
+    ;; public api
     (t/is (nil? (sto/get-object storage (:media-id fmo1))))
     (t/is (nil? (sto/get-object storage (:thumbnail-id fmo1)))))))
@@ -470,7 +461,8 @@
         page-id (first (get-in file [:data :pages]))]

-    (let [rows (th/db-query :file-data-fragment {:file-id (:id file)
-                                                 :deleted-at nil})]
+    (let [rows (th/db-query :file-data {:file-id (:id file)
+                                        :type "fragment"
+                                        :deleted-at nil})]
       (t/is (= (count rows) 1)))
@@ -536,16 +528,14 @@
                      :strokes [{:stroke-opacity 1 :stroke-image {:id (:id fmo5) :width 100 :height 100 :mtype "image/jpeg"}}]})}])

-    ;; run the file-gc task immediately without forced min-age
-    (t/is (false? (th/run-task! :file-gc {:file-id (:id file)})))
-
     ;; run the task again
-    (t/is (true? (th/run-task! :file-gc {:min-age 0 :file-id (:id file)})))
+    (t/is (true? (th/run-task! :file-gc {:file-id (:id file)})))

     (let [res (th/run-task! :objects-gc {})]
       (t/is (= 2 (:processed res))))

-    (let [rows (th/db-query :file-data-fragment {:file-id (:id file)
-                                                 :deleted-at nil})]
+    (let [rows (th/db-query :file-data {:file-id (:id file)
+                                        :type "fragment"
+                                        :deleted-at nil})]
       (t/is (= (count rows) 1)))
@@ -583,7 +573,7 @@
     ;; Now, we have deleted the usage of pointers to the
     ;; file-media-objects, if we paste file-gc, they should be marked
     ;; as deleted.
-    (t/is (true? (th/run-task! :file-gc {:min-age 0 :file-id (:id file)})))
+    (t/is (true? (th/run-task! :file-gc {:file-id (:id file)})))

     ;; This only removes unused fragments, file media are still
     ;; referenced on snapshots.
@@ -592,15 +582,17 @@
-    ;; Mark all snapshots to be a non-snapshot file change
     (th/db-exec! ["update file set has_media_trimmed = false where id = ?" (:id file)])
-    (th/db-exec! ["update file_change set data = null where file_id = ?" (:id file)])
+    (th/db-delete! :file-data {:file-id (:id file)
+                               :type "snapshot"})

     ;; Rerun file-gc and objects-gc task for the same file once all snapshots are
     ;; "expired/deleted"
-    (t/is (true? (th/run-task! :file-gc {:min-age 0 :file-id (:id file)})))
+    (t/is (true? (th/run-task! :file-gc {:file-id (:id file)})))

     (let [res (th/run-task! :objects-gc {})]
       (t/is (= 6 (:processed res))))

-    (let [rows (th/db-query :file-data-fragment {:file-id (:id file)
-                                                 :deleted-at nil})]
+    (let [rows (th/db-query :file-data {:file-id (:id file)
+                                        :type "fragment"
+                                        :deleted-at nil})]
       (t/is (= (count rows) 1)))
@@ -620,7 +612,7 @@
     (t/is (nil? (sto/get-object storage (:media-id fmo2))))
     (t/is (nil? (sto/get-object storage (:media-id fmo1))))))

-(t/deftest file-gc-task-with-object-thumbnails
+(t/deftest file-gc-with-object-thumbnails
   (letfn [(insert-file-object-thumbnail! [& {:keys [profile-id file-id page-id frame-id]}]
             (let [object-id (thc/fmt-object-id file-id page-id frame-id "frame")
                   mfile {:filename "sample.jpg"
@@ -704,11 +696,7 @@
       (t/is (= 1 (:freeze res)))
       (t/is (= 0 (:delete res))))

-    ;; run the file-gc task immediately without forced min-age
-    (t/is (false? (th/run-task! :file-gc {:file-id (:id file)})))
-
-    ;; run the task again
-    (t/is (true? (th/run-task! :file-gc {:min-age 0 :file-id (:id file)})))
+    (t/is (true? (th/run-task! :file-gc {:file-id (:id file)})))

     ;; retrieve file and check trimmed attribute
     (let [row (th/db-get :file {:id (:id file)})]
@@ -738,7 +726,7 @@
                     :page-id page-id
                     :id frame-id-2}])

-    (t/is (true? (th/run-task! :file-gc {:min-age 0 :file-id (:id file)})))
+    (t/is (true? (th/run-task! :file-gc {:file-id (:id file)})))

     (let [rows (th/db-query :file-tagged-object-thumbnail {:file-id file-id})]
       (t/is (= 2 (count rows)))
@@ -772,7 +760,7 @@
                     :page-id page-id
                     :id frame-id-1}])

-    (t/is (true? (th/run-task! :file-gc {:min-age 0 :file-id (:id file)})))
+    (t/is (true? (th/run-task! :file-gc {:file-id (:id file)})))

     (let [rows (th/db-query :file-tagged-object-thumbnail {:file-id file-id})]
       (t/is (= 1 (count rows)))
@@ -933,6 +921,8 @@
             out    (th/command! params)]
         (t/is (nil? (:error out))))

+      (th/run-pending-tasks!)
+
       ;; query the list of files after soft deletion
       (let [data {::th/type :get-project-files
                   ::rpc/profile-id (:id profile1)
@@ -943,23 +933,24 @@
         (let [result (:result out)]
           (t/is (= 0 (count result)))))

-      ;; run permanent deletion (should be noop)
-      (let [result (th/run-task! :objects-gc {})]
-        (t/is (= 0 (:processed result))))
-
-      ;; query the list of file libraries of a after hard deletion
       (let [data {::th/type :get-file-libraries
                   ::rpc/profile-id (:id profile1)
                   :file-id (:id file)}
             out  (th/command! data)]
         ;; (th/print-result! out)
-        (t/is (nil? (:error out)))
-        (let [result (:result out)]
-          (t/is (= 0 (count result)))))
+        (let [error      (:error out)
+              error-data (ex-data error)]
+          (t/is (th/ex-info? error))
+          (t/is (= (:type error-data) :not-found))))

+      ;; run permanent deletion (should be noop)
+      (let [result (th/run-task! :objects-gc {})]
+        (t/is (= 0 (:processed result))))
+
       ;; run permanent deletion
       (let [result (th/run-task! :objects-gc {:deletion-threshold (cf/get-deletion-delay)})]
-        (t/is (= 1 (:processed result))))
+        (t/is (= 3 (:processed result))))

       ;; query the list of file libraries of a after hard deletion
       (let [data {::th/type :get-file-libraries
@@ -972,7 +963,6 @@
           (t/is (th/ex-info? error))
           (t/is (= (:type error-data) :not-found))))))

 (t/deftest object-thumbnails-ops
   (let [prof (th/create-profile* 1 {:is-active true})
         file (th/create-file* 1 {:profile-id (:id prof)
@@ -1282,17 +1272,19 @@
                                 :is-shared false})
         page-id  (uuid/random)
-        shape-id (uuid/random)]
+        shape-id (uuid/random)
+        sobject  (volatile! nil)]

     ;; Preventive file-gc
-    (t/is (true? (th/run-task! :file-gc {:min-age 0 :file-id (:id file)})))
+    (t/is (true? (th/run-task! :file-gc {:file-id (:id file)})))

     ;; Preventive objects-gc
     (let [result (th/run-task! :objects-gc {})]
+      ;; deletes the fragment created by file-gc
       (t/is (= 1 (:processed result))))

     ;; Check the number of fragments before adding the page
-    (let [rows (th/db-query :file-data-fragment {:file-id (:id file)})]
+    (let [rows (th/db-query :file-data {:file-id (:id file) :type "fragment"})]
       (t/is (= 1 (count rows)))
       (t/is (every? #(some? (:data %)) rows)))
@@ -1303,29 +1295,36 @@
     ;; Run FileGC again, with tiered storage activated
     (with-redefs [app.config/flags (conj app.config/flags :tiered-file-data-storage)]
-      (t/is (true? (th/run-task! :file-gc {:min-age 0 :file-id (:id file)})))
+      (t/is (true? (th/run-task! :file-gc {:file-id (:id file)}))))

       ;; The FileGC task will schedule an inner taskq
-      (th/run-pending-tasks!))
+    (th/run-pending-tasks!)
+
+    (let [res (th/run-task! :storage-gc-touched {:min-age 0})]
+      (t/is (= 2 (:freeze res)))
+      (t/is (= 0 (:delete res))))

     ;; Clean objects after file-gc
     (let [result (th/run-task! :objects-gc {})]
       (t/is (= 1 (:processed result))))

     ;; Check the number of fragments before adding the page
-    (let [rows (th/db-query :file-data-fragment {:file-id (:id file)})]
+    (let [rows (th/db-query :file-data {:file-id (:id file) :type "fragment"})]
+      ;; (pp/pprint rows)
       (t/is (= 1 (count rows)))
       (t/is (every? #(nil? (:data %)) rows))
-      (t/is (every? #(uuid? (:data-ref-id %)) rows))
-      (t/is (every? #(= "objects-storage" (:data-backend %)) rows)))
+      (t/is (every? #(= "storage" (:backend %)) rows)))

-    (let [file    (th/db-get :file {:id (:id file)})
+    (let [file    (-> (th/db-get :file-data {:id (:id file) :type "main"})
+                      (update :metadata fdata/decode-metadata))
           storage (sto/resolve th/*system*)]
-      (t/is (= "objects-storage" (:data-backend file)))
+      ;; (pp/pprint file)
+      (t/is (= "storage" (:backend file)))
       (t/is (nil? (:data file)))
-      (t/is (uuid? (:data-ref-id file)))

-      (let [sobj (sto/get-object storage (:data-ref-id file))]
+      (let [sobj (sto/get-object storage (-> file :metadata :storage-ref-id))]
+        (vreset! sobject sobj)
+        ;; (pp/pprint (meta sobj))
         (t/is (= "file-data" (:bucket (meta sobj))))
         (t/is (= (:id file) (:file-id (meta sobj))))))
@@ -1340,36 +1339,68 @@
                     :name "test"
                     :id page-id}])

-    ;; Check the number of fragments
-    (let [rows (th/db-query :file-data-fragment {:file-id (:id file)})]
-      (t/is (= 2 (count rows))))
-
     ;; Check the number of fragments
     (let [[row1 row2 :as rows]
-          (th/db-query :file-data-fragment
+          (th/db-query :file-data
                        {:file-id (:id file)
-                        :deleted-at nil}
+                        :type "fragment"}
                        {:order-by [:created-at]})]
       ;; (pp/pprint rows)
       (t/is (= 2 (count rows)))
       (t/is (nil? (:data row1)))
-      (t/is (= "objects-storage" (:data-backend row1)))
+      (t/is (= "storage" (:backend row1)))
       (t/is (bytes? (:data row2)))
-      (t/is (nil? (:data-backend row2))))
+      (t/is (= "db" (:backend row2))))

     ;; The file-gc should mark for remove unused fragments
-    (t/is (true? (th/run-task! :file-gc {:min-age 0 :file-id (:id file)})))
+    (t/is (true? (th/run-task! :file-gc {:file-id (:id file)})))

-    ;; The objects-gc should remove unused fragments
+    ;; The file-gc task, recreates all fragments, so after it we have
+    ;; now the double of fragments, and the old ones are marked as
+    ;; deleted, and the new ones are on DB
+    (let [[row1 row2 row3 row4 :as rows]
+          (th/db-query :file-data
+                       {:file-id (:id file)
+                        :type "fragment"}
+                       {:order-by [:created-at]})]
+      ;; (pp/pprint rows)
+      (t/is (= 4 (count rows)))
+      (t/is (nil? (:data row1)))
+      (t/is (ct/inst? (:deleted-at row1)))
+      (t/is (= "storage" (:backend row1)))
+      (t/is (bytes? (:data row2)))
+      (t/is (= "db" (:backend row2)))
+      (t/is (ct/inst? (:deleted-at row2)))
+      (t/is (bytes? (:data row3)))
+      (t/is (= "db" (:backend row3)))
+      (t/is (nil? (:deleted-at row3)))
+      (t/is (bytes? (:data row4)))
+      (t/is (= "db" (:backend row4)))
+      (t/is (nil? (:deleted-at row4))))
+
+    ;; The objects-gc should remove the marked to delete fragments
     (let [res (th/run-task! :objects-gc {})]
       (t/is (= 2 (:processed res))))

-    ;; Check the number of fragments before adding the page
-    (let [rows (th/db-query :file-data-fragment {:file-id (:id file)})]
+    (let [rows (th/db-query :file-data {:file-id (:id file) :type "fragment"})]
       (t/is (= 2 (count rows)))
       (t/is (every? #(bytes? (:data %)) rows))
-      (t/is (every? #(nil? (:data-ref-id %)) rows))
-      (t/is (every? #(nil? (:data-backend %)) rows)))))
+      (t/is (every? #(= "db" (:backend %)) rows)))
+
+    ;; we ensure that once object-gc is passed and marked two storage
+    ;; objects to delete
+    (let [res (th/run-task! :storage-gc-touched {:min-age 0})]
+      (t/is (= 0 (:freeze res)))
+      (t/is (= 2 (:delete res))))
+
+    (let [storage (sto/resolve th/*system*)]
+      (t/is (uuid? (:id @sobject)))
+      (t/is (nil? (sto/get-object storage (:id @sobject)))))))
 (t/deftest file-gc-with-components-1
   (let [storage (:app.storage/storage th/*system*)
@@ -1384,7 +1415,8 @@
         page-id (first (get-in file [:data :pages]))]

-    (let [rows (th/db-query :file-data-fragment {:file-id (:id file)
-                                                 :deleted-at nil})]
+    (let [rows (th/db-query :file-data {:file-id (:id file)
+                                        :type "fragment"
+                                        :deleted-at nil})]
       (t/is (= (count rows) 1)))
@@ -1437,11 +1469,8 @@
                     :id c-id
                     :anotation nil}])

-    ;; Run the file-gc task immediately without forced min-age
-    (t/is (false? (th/run-task! :file-gc {:file-id (:id file)})))
-
     ;; Run the task again
-    (t/is (true? (th/run-task! :file-gc {:min-age 0 :file-id (:id file)})))
+    (t/is (true? (th/run-task! :file-gc {:file-id (:id file)})))

     ;; Retrieve file and check trimmed attribute
     (let [row (th/db-get :file {:id (:id file)})]
@@ -1651,8 +1680,7 @@
       (t/is (some? (not-empty (:objects component))))))

     ;; Re-run the file-gc task
-    (t/is (true? (th/run-task! :file-gc {:min-age 0 :file-id (:id file-1)})))
-    (t/is (false? (th/run-task! :file-gc {:min-age 0 :file-id (:id file-2)})))
+    (t/is (true? (th/run-task! :file-gc {:file-id (:id file-1)})))

     ;; Check that component is still there after file-gc task
     (let [data {::th/type :get-file
View File

@@ -39,8 +39,6 @@
       (t/is (nil? (:error out)))
       (:result out)))

-;; TODO: migrate to commands
 (t/deftest duplicate-file
   (let [storage (-> (:app.storage/storage th/*system*)
                     (configure-storage-backend))
@@ -233,15 +231,7 @@
       ;; check that the both files are equivalent
       (doseq [[fa fb] (map vector p1-files p2-files)]
         (t/is (not= (:id fa) (:id fb)))
-        (t/is (= (:name fa) (:name fb)))
-        (when (= (:id fa) (:id file1))
-          (t/is (false? (b/equals? (:data fa)
-                                   (:data fb)))))
-        (when (= (:id fa) (:id file2))
-          (t/is (false? (b/equals? (:data fa)
-                                   (:data fb)))))))))))
+        (t/is (= (:name fa) (:name fb)))))))))

 (t/deftest duplicate-project-with-deleted-files
   (let [storage (-> (:app.storage/storage th/*system*)
@@ -297,15 +287,7 @@
       ;; check that the both files are equivalent
       (doseq [[fa fb] (map vector (rest p1-files) p2-files)]
         (t/is (not= (:id fa) (:id fb)))
-        (t/is (= (:name fa) (:name fb)))
-        (when (= (:id fa) (:id file1))
-          (t/is (false? (b/equals? (:data fa)
-                                   (:data fb)))))
-        (when (= (:id fa) (:id file2))
-          (t/is (false? (b/equals? (:data fa)
-                                   (:data fb)))))))))))
+        (t/is (= (:name fa) (:name fb)))))))))

 (t/deftest move-file-on-same-team
   (let [profile (th/create-profile* 1 {:is-active true})


@@ -162,7 +162,7 @@
   ;; execute permanent deletion task
   (let [result (th/run-task! :objects-gc {:min-age 0})]
-    (t/is (= 4 (:processed result))))
+    (t/is (= 6 (:processed result))))

   (let [row (th/db-get :team
                        {:id (:default-team-id prof)}
@@ -324,7 +324,7 @@
   ;; execute permanent deletion task
   (let [result (th/run-task! :objects-gc {:min-age 0})]
-    (t/is (= 4 (:processed result))))
+    (t/is (= 6 (:processed result))))

   (let [row (th/db-get :team
                        {:id (:default-team-id prof1)}
@@ -363,7 +363,7 @@
   ;; execute permanent deletion task
   (let [result (th/run-task! :objects-gc {:min-age 0})]
-    (t/is (= 8 (:processed result))))))
+    (t/is (= 10 (:processed result))))))

 (t/deftest email-blacklist-1
@@ -514,8 +514,7 @@
       (t/is (= 0 (:call-count @mock))))))))

 (t/deftest prepare-and-register-with-invitation-and-enabled-registration-1
-  (let [sprops (:app.setup/props th/*system*)
-        itoken (tokens/generate sprops
+  (let [itoken (tokens/generate th/*system*
                                 {:iss :team-invitation
                                  :exp (ct/in-future "48h")
                                  :role :editor
@@ -543,8 +542,7 @@
     (t/is (string? (:invitation-token result))))))

 (t/deftest prepare-and-register-with-invitation-and-enabled-registration-2
-  (let [sprops (:app.setup/props th/*system*)
-        itoken (tokens/generate sprops
+  (let [itoken (tokens/generate th/*system*
                                 {:iss :team-invitation
                                  :exp (ct/in-future "48h")
                                  :role :editor
@@ -565,8 +563,7 @@
 (t/deftest prepare-and-register-with-invitation-and-disabled-registration-1
   (with-redefs [app.config/flags [:disable-registration]]
-    (let [sprops (:app.setup/props th/*system*)
-          itoken (tokens/generate sprops
+    (let [itoken (tokens/generate th/*system*
                                   {:iss :team-invitation
                                    :exp (ct/in-future "48h")
                                    :role :editor
@@ -586,8 +583,7 @@
 (t/deftest prepare-and-register-with-invitation-and-disabled-registration-2
   (with-redefs [app.config/flags [:disable-registration]]
-    (let [sprops (:app.setup/props th/*system*)
-          itoken (tokens/generate sprops
+    (let [itoken (tokens/generate th/*system*
                                   {:iss :team-invitation
                                    :exp (ct/in-future "48h")
                                    :role :editor
@@ -608,8 +604,7 @@
 (t/deftest prepare-and-register-with-invitation-and-disabled-login-with-password
   (with-redefs [app.config/flags [:disable-login-with-password]]
-    (let [sprops (:app.setup/props th/*system*)
-          itoken (tokens/generate sprops
+    (let [itoken (tokens/generate th/*system*
                                   {:iss :team-invitation
                                    :exp (ct/in-future "48h")
                                    :role :editor


@@ -208,8 +208,6 @@
         profile2 (th/create-profile* 2 {:is-active true})
         team     (th/create-team* 1 {:profile-id (:id profile1)})
-
-        sprops   (:app.setup/props th/*system*)
         pool     (:app.db/pool th/*system*)]

     ;; Try to invite a not existing user
@@ -226,7 +224,7 @@
       (t/is (= 1 (-> out :result :total)))
       (let [token  (-> out :result :invitations first)
-            claims (tokens/decode sprops token)]
+            claims (tokens/decode th/*system* token)]
         (t/is (= :team-invitation (:iss claims)))
         (t/is (= (:id profile1) (:profile-id claims)))
         (t/is (= :editor (:role claims)))
@@ -250,7 +248,7 @@
      (t/is (= 1 (-> out :result :total)))
      (let [token  (-> out :result :invitations first)
-           claims (tokens/decode sprops token)]
+           claims (tokens/decode th/*system* token)]
        (t/is (= :team-invitation (:iss claims)))
        (t/is (= (:id profile1) (:profile-id claims)))
        (t/is (= :editor (:role claims)))
@@ -266,10 +264,9 @@
         team     (th/create-team* 1 {:profile-id (:id profile1)})
-        sprops   (:app.setup/props th/*system*)
         pool     (:app.db/pool th/*system*)]

-    (let [token (tokens/generate sprops
+    (let [token (tokens/generate th/*system*
                                  {:iss :team-invitation
                                   :exp (ct/in-future "1h")
                                   :profile-id (:id profile1)
@@ -585,7 +582,7 @@
       (t/is (ct/inst? (:deleted-at (first rows)))))

     (let [result (th/run-task! :objects-gc {:deletion-threshold (cf/get-deletion-delay)})]
-      (t/is (= 5 (:processed result))))))
+      (t/is (= 7 (:processed result))))))

 (t/deftest create-team-access-request
   (with-mocks [mock {:target 'app.email/send! :return nil}]


@@ -6,26 +6,28 @@
   org.clojure/data.fressian {:mvn/version "1.1.0"}
   org.clojure/clojurescript {:mvn/version "1.12.42"}
+  org.apache.commons/commons-pool2 {:mvn/version "2.12.1"}

   ;; Logging
-  org.apache.logging.log4j/log4j-api {:mvn/version "2.24.3"}
-  org.apache.logging.log4j/log4j-core {:mvn/version "2.24.3"}
-  org.apache.logging.log4j/log4j-web {:mvn/version "2.24.3"}
-  org.apache.logging.log4j/log4j-jul {:mvn/version "2.24.3"}
-  org.apache.logging.log4j/log4j-slf4j2-impl {:mvn/version "2.24.3"}
+  org.apache.logging.log4j/log4j-api {:mvn/version "2.25.1"}
+  org.apache.logging.log4j/log4j-core {:mvn/version "2.25.1"}
+  org.apache.logging.log4j/log4j-web {:mvn/version "2.25.1"}
+  org.apache.logging.log4j/log4j-jul {:mvn/version "2.25.1"}
+  org.apache.logging.log4j/log4j-slf4j2-impl {:mvn/version "2.25.1"}
   org.slf4j/slf4j-api {:mvn/version "2.0.17"}
-  pl.tkowalcz.tjahzi/log4j2-appender {:mvn/version "0.9.32"}
+  pl.tkowalcz.tjahzi/log4j2-appender {:mvn/version "0.9.40"}
   selmer/selmer {:mvn/version "1.12.62"}
   criterium/criterium {:mvn/version "0.4.6"}
   metosin/jsonista {:mvn/version "0.3.13"}
-  metosin/malli {:mvn/version "0.18.0"}
+  metosin/malli {:mvn/version "0.19.1"}
   expound/expound {:mvn/version "0.9.0"}
   com.cognitect/transit-clj {:mvn/version "1.0.333"}
   com.cognitect/transit-cljs {:mvn/version "0.8.280"}
   java-http-clj/java-http-clj {:mvn/version "0.4.3"}
-  integrant/integrant {:mvn/version "0.13.1"}
+  integrant/integrant {:mvn/version "1.0.0"}
   funcool/tubax {:mvn/version "2021.05.20-0"}
   funcool/cuerdas {:mvn/version "2025.06.16-414"}
@@ -47,7 +49,7 @@
   org.la4j/la4j {:mvn/version "0.6.0"}

   ;; exception printing
-  fipp/fipp {:mvn/version "0.6.27"}
+  fipp/fipp {:mvn/version "0.6.29"}
   me.flowthing/pp {:mvn/version "2024-11-13.77"}
@@ -59,7 +61,7 @@
  {:dev
   {:extra-deps
    {org.clojure/tools.namespace {:mvn/version "RELEASE"}
-    thheller/shadow-cljs {:mvn/version "3.1.5"}
+    thheller/shadow-cljs {:mvn/version "3.2.0"}
     com.clojure-goes-fast/clj-async-profiler {:mvn/version "RELEASE"}
     com.bhauman/rebel-readline {:mvn/version "RELEASE"}
     criterium/criterium {:mvn/version "RELEASE"}
@@ -68,7 +70,7 @@
   :build
   {:extra-deps
-   {io.github.clojure/tools.build {:git/tag "v0.10.9" :git/sha "e405aac"}}
+   {io.github.clojure/tools.build {:mvn/version "0.10.10"}}
    :ns-default build}

   :test


@@ -50,6 +50,13 @@
     (let [target (with-meta target {:tag 'java.nio.ByteBuffer})]
       `(long (.getInt ~target (unchecked-int ~offset))))))

+(defmacro read-long
+  [target offset]
+  (if (:ns &env)
+    `(.getInt64 ~target ~offset true)
+    (let [target (with-meta target {:tag 'java.nio.ByteBuffer})]
+      `(.getLong ~target (unchecked-int ~offset)))))
+
 (defmacro read-float
   [target offset]
   (if (:ns &env)
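On the JVM branch, the new `read-long` macro expands to an absolute-index `.getLong` on a `ByteBuffer` that this namespace keeps in little-endian order. A minimal Java sketch of that expansion (class and method names here are hypothetical, for illustration only):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class ReadLongSketch {
    // Mirrors what the :clj branch of `read-long` expands to:
    // an absolute-index getLong on an already LITTLE_ENDIAN buffer.
    static long readLong(ByteBuffer target, int offset) {
        return target.getLong(offset);
    }

    public static void main(String[] args) {
        ByteBuffer buf = ByteBuffer.allocate(16).order(ByteOrder.LITTLE_ENDIAN);
        buf.putLong(0, 0x0102030405060708L);
        // Little-endian: the least-significant byte is stored first.
        System.out.println(buf.get(0));                         // prints 8
        System.out.println(Long.toHexString(readLong(buf, 0))); // prints 102030405060708
    }
}
```

The absolute accessors leave the buffer's position untouched, matching how the rest of these macros address the buffer by explicit offset.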
@@ -75,6 +82,40 @@
       (finally
         (.order ~target ByteOrder/LITTLE_ENDIAN))))))

+(defmacro read-bytes
+  "Get a byte array from buffer. It is potentially unsafe because on
+  JS/CLJS it returns a subarray without doing any copy of data."
+  [target offset size]
+  (if (:ns &env)
+    `(new js/Uint8Array
+          (.-buffer ~target)
+          (+ (.-byteOffset ~target) ~offset)
+          ~size)
+    (let [target (with-meta target {:tag 'java.nio.ByteBuffer})
+          bbuf   (with-meta (gensym "bbuf") {:tag bytes})]
+      `(let [~bbuf (byte-array ~size)]
+         (.get ~target
+               (unchecked-int ~offset)
+               ~bbuf
+               0
+               ~size)
+         ~bbuf))))
+
+;; FIXME: implement in cljs
+(defmacro write-bytes
+  ([target offset src size]
+   `(write-bytes ~target ~offset ~src 0 ~size))
+  ([target offset src src-offset size]
+   (if (:ns &env)
+     (throw (ex-info "not implemented" {}))
+     (let [target (with-meta target {:tag 'java.nio.ByteBuffer})
+           src    (with-meta src {:tag 'bytes})]
+       `(.put ~target
+              (unchecked-int ~offset)
+              ~src
+              (unchecked-int ~src-offset)
+              (unchecked-int ~size))))))
+
 (defmacro write-byte
   [target offset value]
   (if (:ns &env)
@@ -144,13 +185,15 @@
         (.setUint32 ~target (+ ~offset 12) (aget barray# 3) true))
     (let [target (with-meta target {:tag 'java.nio.ByteBuffer})
-          value  (with-meta value {:tag 'java.util.UUID})]
-      `(try
-         (.order ~target ByteOrder/BIG_ENDIAN)
-         (.putLong ~target (unchecked-int (+ ~offset 0)) (.getMostSignificantBits ~value))
-         (.putLong ~target (unchecked-int (+ ~offset 8)) (.getLeastSignificantBits ~value))
-         (finally
-           (.order ~target ByteOrder/LITTLE_ENDIAN))))))
+          value  (with-meta value {:tag 'java.util.UUID})
+          prev   (with-meta (gensym "prev-") {:tag 'java.nio.ByteOrder})]
+      `(let [~prev (.order ~target)]
+         (try
+           (.order ~target ByteOrder/BIG_ENDIAN)
+           (.putLong ~target (unchecked-int (+ ~offset 0)) (.getMostSignificantBits ~value))
+           (.putLong ~target (unchecked-int (+ ~offset 8)) (.getLeastSignificantBits ~value))
+           (finally
+             (.order ~target ~prev)))))))

 (defn wrap
   [data]
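The `write-uuid` change captures the buffer's current byte order before temporarily switching to big-endian (UUID wire order) and restores that captured order afterwards, rather than hard-coding `LITTLE_ENDIAN` on the way out. A Java sketch of the patched expansion (class name is hypothetical):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.util.UUID;

public class WriteUuidSketch {
    // Sketch of what the patched `write-uuid` expands to on the JVM:
    // save the caller's byte order, write the two UUID halves big-endian,
    // then restore whatever order was in effect before the call.
    static void writeUuid(ByteBuffer target, int offset, UUID value) {
        ByteOrder prev = target.order();
        try {
            target.order(ByteOrder.BIG_ENDIAN);
            target.putLong(offset, value.getMostSignificantBits());
            target.putLong(offset + 8, value.getLeastSignificantBits());
        } finally {
            target.order(prev); // restore caller's byte order
        }
    }

    public static void main(String[] args) {
        ByteBuffer buf = ByteBuffer.allocate(16).order(ByteOrder.LITTLE_ENDIAN);
        writeUuid(buf, 0, UUID.fromString("00010203-0405-0607-0809-0a0b0c0d0e0f"));
        System.out.println(buf.order()); // LITTLE_ENDIAN, as before the call
        System.out.println(buf.get(0));  // prints 0: most-significant byte first
    }
}
```

Restoring the saved order instead of a constant makes the macro safe to use on buffers that were deliberately configured with a different order.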
@@ -160,7 +203,7 @@
 (defn allocate
   [size]
-  #?(:clj (let [buffer (ByteBuffer/allocate (int size))]
+  #?(:clj (let [buffer (ByteBuffer/allocate (unchecked-int size))]
             (.order buffer ByteOrder/LITTLE_ENDIAN))
      :cljs (new js/DataView (new js/ArrayBuffer size))))
@@ -181,6 +224,14 @@
       (.set dst-view src-view)
       (js/DataView. dst-buff))))

+;; FIXME: cljs impl
+#?(:clj
+   (defn copy-bytes
+     [src src-offset size dst dst-offset]
+     (let [tmp (byte-array size)]
+       (.get ^ByteBuffer src src-offset tmp 0 size)
+       (.put ^ByteBuffer dst dst-offset tmp 0 size))))
+
 (defn equals?
   [buffer-a buffer-b]
   #?(:clj
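The new `copy-bytes` uses the absolute bulk `get`/`put` overloads through a temporary array, so neither buffer's position is disturbed. A Java sketch of the same approach (the class name is hypothetical; note the absolute bulk overloads require Java 13+):

```java
import java.nio.ByteBuffer;

public class CopyBytesSketch {
    // Mirrors the :clj branch of `copy-bytes`: absolute bulk get/put
    // via a temporary byte array, leaving both positions untouched.
    static void copyBytes(ByteBuffer src, int srcOffset, int size,
                          ByteBuffer dst, int dstOffset) {
        byte[] tmp = new byte[size];
        src.get(srcOffset, tmp, 0, size);  // absolute read (Java 13+)
        dst.put(dstOffset, tmp, 0, size);  // absolute write (Java 13+)
    }

    public static void main(String[] args) {
        ByteBuffer src = ByteBuffer.allocate(8);
        ByteBuffer dst = ByteBuffer.allocate(8);
        for (int i = 0; i < 8; i++) src.put(i, (byte) i);
        copyBytes(src, 2, 4, dst, 0);
        System.out.println(dst.get(0));     // prints 2
        System.out.println(src.position()); // prints 0: positions untouched
    }
}
```

The temporary array costs one allocation per call; a relative `put(ByteBuffer)` would avoid it but would mutate both buffers' positions, which this namespace's offset-based API deliberately avoids.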
@@ -208,3 +259,18 @@
   [o]
   #?(:clj (instance? ByteBuffer o)
      :cljs (instance? js/DataView o)))
+
+(defn slice
+  [buffer offset size]
+  #?(:cljs
+     (let [offset (+ (.-byteOffset buffer) offset)]
+       (new js/DataView (.-buffer buffer) offset size))
+     :clj
+     (-> (.slice ^ByteBuffer buffer (unchecked-int offset) (unchecked-int size))
+         (.order ByteOrder/LITTLE_ENDIAN))))
+
+(defn size
+  [o]
+  #?(:cljs (.-byteLength ^js o)
+     :clj  (.capacity ^ByteBuffer o)))
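The explicit `.order` after `.slice` in the :clj branch is not redundant: `ByteBuffer.slice(int, int)` (Java 13+) returns a view sharing the backing storage, but the new buffer's byte order always starts out big-endian regardless of the parent's order. A Java sketch (class name is hypothetical):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class SliceSketch {
    // Mirrors the :clj branch of `slice`: a shared-storage view,
    // with LITTLE_ENDIAN re-applied because slices do not inherit order.
    static ByteBuffer slice(ByteBuffer buffer, int offset, int size) {
        return buffer.slice(offset, size).order(ByteOrder.LITTLE_ENDIAN);
    }

    public static void main(String[] args) {
        ByteBuffer buf = ByteBuffer.allocate(16).order(ByteOrder.LITTLE_ENDIAN);
        System.out.println(buf.slice(4, 8).order()); // BIG_ENDIAN: not inherited
        ByteBuffer s = slice(buf, 4, 8);
        s.putInt(0, 1);                   // writes through to the parent
        System.out.println(buf.get(4));   // prints 1: shared storage
        System.out.println(s.capacity()); // prints 8: what `size` reports
    }
}
```

Without the re-applied order, multi-byte reads through a slice would silently decode in the wrong endianness.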

Some files were not shown because too many files have changed in this diff.