Compare commits

...

233 Commits
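A count like the one above can be reproduced locally with `git rev-list --count <base>..<head>`. A minimal sketch — it builds a throwaway repository so the commands run anywhere; against the real repository you would compare the actual branch refs instead:

```shell
# Sketch: count the commits between two refs, as in the "233 Commits" header.
# The repository and refs here are illustrative, not from this project.
dir=$(mktemp -d)
cd "$dir"
git init -q
git -c user.name=t -c user.email=t@e commit -q --allow-empty -m "base"
git tag base
git -c user.name=t -c user.email=t@e commit -q --allow-empty -m "feature work"
# One commit on HEAD that is not reachable from base:
git rev-list --count base..HEAD
```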

Author SHA1 Message Date
NarayanBavisetti
f078a0bb5a chore: rendered the user timezone 2024-11-19 16:30:03 +05:30
sriram veeraghanta
c1ac6e4244 chore: removing dependabot updates alerts 2024-11-18 12:06:13 +05:30
dependabot[bot]
6d98619082 chore(deps): bump actions/checkout from 3 to 4 (#6005)
Bumps [actions/checkout](https://github.com/actions/checkout) from 3 to 4.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/v3...v4)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-16 19:42:08 +05:30
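For a consumer of the action, a bump like the one above is a one-line change in each workflow that uses it. A hypothetical workflow fragment (the workflow name and job are illustrative, not from this repository):

```yaml
# Hypothetical workflow fragment; only the `uses:` line changes in the v3 -> v4 bump.
name: CI
on: [push]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4   # previously actions/checkout@v3
      - run: echo "checked out"
```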
dependabot[bot]
52d3169542 chore(deps): bump softprops/action-gh-release from 2.0.8 to 2.1.0 (#6010)
Bumps [softprops/action-gh-release](https://github.com/softprops/action-gh-release) from 2.0.8 to 2.1.0.
- [Release notes](https://github.com/softprops/action-gh-release/releases)
- [Changelog](https://github.com/softprops/action-gh-release/blob/master/CHANGELOG.md)
- [Commits](https://github.com/softprops/action-gh-release/compare/v2.0.8...v2.1.0)

---
updated-dependencies:
- dependency-name: softprops/action-gh-release
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-16 19:40:41 +05:30
dependabot[bot]
5989b1a134 chore(deps): bump github/codeql-action from 2 to 3 (#6011)
Bumps [github/codeql-action](https://github.com/github/codeql-action) from 2 to 3.
- [Release notes](https://github.com/github/codeql-action/releases)
- [Changelog](https://github.com/github/codeql-action/blob/main/CHANGELOG.md)
- [Commits](https://github.com/github/codeql-action/compare/v2...v3)

---
updated-dependencies:
- dependency-name: github/codeql-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-16 19:39:31 +05:30
sriram veeraghanta
291bb5c899 Merge branch 'preview' of github.com:makeplane/plane into preview 2024-11-16 19:37:22 +05:30
sriram veeraghanta
2ef00efaab fix: turbo repo upgrade 2024-11-16 19:37:06 +05:30
dependabot[bot]
c5f96466e9 chore(deps): bump cross-spawn in the npm_and_yarn group (#6038)
Bumps the npm_and_yarn group with 1 update: [cross-spawn](https://github.com/moxystudio/node-cross-spawn).


Updates `cross-spawn` from 7.0.3 to 7.0.5
- [Changelog](https://github.com/moxystudio/node-cross-spawn/blob/master/CHANGELOG.md)
- [Commits](https://github.com/moxystudio/node-cross-spawn/compare/v7.0.3...v7.0.5)

---
updated-dependencies:
- dependency-name: cross-spawn
  dependency-type: indirect
  dependency-group: npm_and_yarn
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-16 18:40:01 +05:30
sriram veeraghanta
35938b57af fix: dependabot security patch only 2024-11-16 18:36:47 +05:30
dependabot[bot]
1b1b160c04 chore(deps): bump docker/build-push-action from 5.1.0 to 6.9.0 (#6004)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 5.1.0 to 6.9.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](https://github.com/docker/build-push-action/compare/v5.1.0...v6.9.0)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-16 18:30:44 +05:30
sriram veeraghanta
4149e84e62 Create dependabot.yml (#6002) 2024-11-16 18:25:29 +05:30
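Taken together with the later "dependabot security patch only" and "removing dependabot updates alerts" commits, the configuration appears to restrict Dependabot to security updates. A hypothetical sketch of such a `dependabot.yml` (not the repository's actual file): setting `open-pull-requests-limit: 0` disables version-update PRs for an ecosystem while security-update PRs still open.

```yaml
# Hypothetical dependabot.yml restricting Dependabot to security updates.
version: 2
updates:
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "weekly"
    open-pull-requests-limit: 0   # no version-update PRs; security PRs unaffected
```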
Aaryan Khandelwal
9408e92e44 Revert "[WEB-1435] dev: conflict free issue descriptions (#5912)" (#6000)
This reverts commit e9680cab74.
2024-11-15 17:13:31 +05:30
Aaryan Khandelwal
e9680cab74 [WEB-1435] dev: conflict free issue descriptions (#5912)
* chore: new description binary endpoints

* chore: conflict free issue description

* chore: fix submitting status

* chore: update yjs utils

* chore: handle component re-mounting

* chore: update buffer response type

* chore: add try catch for issue description update

* chore: update buffer response type

* chore: description binary in retrieve

* chore: update issue description hook

* chore: decode description binary

* chore: migrations fixes and cleanup

* chore: migration fixes

* fix: inbox issue description

* chore: move update operations to the issue store

* fix: merge conflicts

* chore: reverted the commit

* chore: removed the unwanted imports

* chore: remove unnecessary props

* chore: remove unused services

* chore: update live server error handling

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2024-11-15 16:38:58 +05:30
sriram veeraghanta
229610513a fix: django instrumentation fixes 2024-11-13 21:04:16 +05:30
sriram veeraghanta
f9d9c92c83 fix: opentelemetry sdk package update 2024-11-13 20:27:47 +05:30
Aaryan Khandelwal
89588d4451 fix: issue and module link validation (#5994)
* fix: issue and module link validation

* chore: removed reset logic
2024-11-13 19:47:30 +05:30
Akshita Goyal
3eb911837c fix: display property in take (#5993) 2024-11-13 18:02:24 +05:30
rahulramesha
4b50b27a74 [WEB-2442] feat: Minor Timeline view Enhancements (#5987)
* fix timeline scroll to the right in some cases

(cherry picked from commit 17043a6c7f)

* add get position based on Date

(cherry picked from commit 2fbe22d689)

* Add sticky block name to enable it to be read throughout the block regardless of scroll position

(cherry picked from commit 447af2e05a)

* Enable blocks to have a single date on the block charts

(cherry picked from commit cb055d566b)

* revert back date-range changes

* change gradient of half blocks on Timeline

* Add instance Id for Timeline Sidebar dragging to avoid enabling dropping of other drag instances

* fix timeline scrolling height
2024-11-13 15:40:37 +05:30
rahulramesha
f44db89f41 [WEB-2628] fix: Sorting by estimates (#5988)
* fix estimates sorting in Front end side

* change estimate sorting keys

* - Fix estimate sorting when local db is enabled
- Fix a bug with sorting on special fields on spreadsheet layout
- Cleanup logging

* Add logic for order by based on layout for special cases of no load

---------

Co-authored-by: Satish Gandham <satish.iitg@gmail.com>
2024-11-13 15:38:43 +05:30
Akshita Goyal
8c3189e1be fix: intake status count (#5990) 2024-11-13 15:38:03 +05:30
sriram veeraghanta
eee2145734 fix: code splitting and instance maintenance screens 2024-11-12 19:48:31 +05:30
Aaryan Khandelwal
106710f3d0 fix: custom background color for table header (#5989) 2024-11-12 15:26:57 +05:30
Anmol Singh Bhatia
db8c4f92e8 chore: theme and code refactor (#5983)
* chore: added pi colors

* chore: de-dupe modal height

---------

Co-authored-by: gakshita <akshitagoyal1516@gmail.com>
2024-11-11 19:53:43 +05:30
Anmol Singh Bhatia
a6cc2c93f8 chore: worklog enhancements (#5982) 2024-11-11 19:27:07 +05:30
Bavisetti Narayan
0428ea06f6 chore: filter the deleted issue assignee (#5984) 2024-11-11 19:25:38 +05:30
Aaryan Khandelwal
7082f7014d style: remove unnecessary bottom padding from the rich text editor (#5976) 2024-11-11 16:11:34 +05:30
Anmol Singh Bhatia
c7c729d81b [WEB-2283] fix: create issue modal parent select ui (#5980)
* fix: create issue modal parent select ui

* chore: code refactor
2024-11-11 16:11:10 +05:30
Aaryan Khandelwal
97eb8d43d4 style: updated margins and font styles for editor (#5978)
* style: updated margins and font styles for editor

* fix: code block font size in small font

* fix: remove duplicate code
2024-11-11 16:10:47 +05:30
Anmol Singh Bhatia
1217af1d5f chore: restrict sub-issue to have different project id than parent (#5981) 2024-11-11 16:10:27 +05:30
Bavisetti Narayan
13083a77eb chore: enable intake from project settings (#5977) 2024-11-09 17:01:21 +05:30
Akshita Goyal
0cd36b854e fix: intake loading (#5966)
* fix: intake loading

* fix: image upload in space
2024-11-08 17:17:15 +05:30
Bavisetti Narayan
1d314dd25f fix: renamed inbox to intake (#5967)
* feat: intake

* chore: intake model migration changes

* dev: update dummy data

* dev: add duplicate apis for inbox

* dev: fix external apis

* fix: external apis

* chore: migration file changes

---------

Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
2024-11-08 17:10:24 +05:30
rahulramesha
1743717351 fix related to activity (#5972) 2024-11-08 17:09:49 +05:30
Satish Gandham
acba451803 [WEB-2706] fix: Add fallback when db initialisation fails (#5973)
* Add fallback when db initialization fails

* add checks for instance.exec

* chore: convert issue boolean fields to actual boolean value.

* change instance exec code

* sync issue to local db when inbox issue is accepted and draft issue is moved to project

* chore: added project and workspace keys

---------

Co-authored-by: rahulramesha <rahulramesham@gmail.com>
Co-authored-by: Prateek Shourya <prateekshourya29@gmail.com>
Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-11-08 17:09:26 +05:30
Aaryan Khandelwal
2193e8c79c fix: editor user config (#5974) 2024-11-08 13:30:06 +05:30
Anmol Singh Bhatia
4c6ab984c3 [WEB-2742] chore: issue link ui revamp (#5971)
* chore-issue-link-ui

* chore: issue link ui revamp
2024-11-07 19:24:15 +05:30
Prateek Shourya
7574206a41 [WEB-2554] improvement: dashboard sidebar list items. (#5970) 2024-11-07 15:31:28 +05:30
Anmol Singh Bhatia
eebc327b10 chore: app sidebar behaviour (#5964) 2024-11-06 18:36:23 +05:30
Prateek Shourya
e19cb012be [WEB-2728] improvement: add true-transparent variant for textarea. (#5960) 2024-11-06 16:56:15 +05:30
guru_sainath
9d1253a61d chore: infra update for maintenance mode (#5963) 2024-11-06 15:13:51 +05:30
Bavisetti Narayan
56755b0e9c chore: intake migration (#5950)
* chore: intake migration

* chore: removed the enum

* chore: removed the source type enum

* chore: changed the migration file
2024-11-05 19:21:20 +05:30
rahulramesha
438d1bcfbd add missing config to get issues api call (#5955) 2024-11-05 17:50:23 +05:30
Akshita Goyal
45a5cf5119 fix: editor height (#5953)
* fix: editor height

* fix: removed unwanted class

* fix: editor height
2024-11-05 17:47:39 +05:30
Aaryan Khandelwal
b4de055463 [PULSE-42] feat: text alignment for all editors (#5847)
* feat: text alignment for editors

* fix: text alignment types

* fix: build errors

* fix: build error

* fix: toolbar movement post alignment selection

* fix: callout type

* fix: image node types

* chore: add ts error warning
2024-11-05 17:46:34 +05:30
Aaryan Khandelwal
bb311b750f fix: wrong token being passed in the read-only editor (#5954)
* fix: wrong token

* chore: update useMemo dependencies
2024-11-05 17:45:53 +05:30
Anmol Singh Bhatia
ea8583b2d4 chore: code refactor (#5952)
* chore: code refactor

* chore: code refactor
2024-11-05 17:04:03 +05:30
Akshita Goyal
eed2ca77ef fix: added workspaceslug in renderChildren of project settings (#5951)
* fix: added workspaceslug in renderChildren of project settings

* fix: updated apis

* fix: types

* fix: added editor

* fix: handled avatar for intake
2024-11-05 16:07:27 +05:30
Akshita Goyal
9309d1b574 feat: Pi chat (#5933)
* fix: added pi chat

* fix: added bot

* fix: removed pi chat from community version

* fix: removed unwanted files

* fix: removed unused import
2024-11-05 15:16:58 +05:30
Aaryan Khandelwal
f205d72782 fix: floating toolbar max width (#5949) 2024-11-04 20:17:20 +05:30
rahulramesha
3d2fe7841f fix issues fetching while changing filters by making sure to pass the abort controller config to apis (#5948) 2024-11-04 20:16:56 +05:30
rahulramesha
71589f93ca [WEB-2442] fix: Timeline layout bugs (#5946)
* fix relation creation and removal for Issue relations

* fix Scrolling to block when the block is beyond current chart's limits

* fix dark mode for timeline layout

* use a hook to get the current relations available in the environment, instead of directly importing it

* Update relation activity for all the relations
2024-11-04 16:55:38 +05:30
Satish Gandham
a1bfde6af9 [WEB-2706] fix: Fix issue with SQLite transactions (#5934)
* - Fix transaction within transaction issue
- Close DB handles on reload
- Fix GET_ISSUES tracking

* Cleanup stray code

* Fix lint error

* Possible fix for NoModificationAllowedError
2024-11-04 16:54:13 +05:30
Lakhan Baheti
20b2a70939 fix: global css conflict (#5945) 2024-11-04 16:15:17 +05:30
Prateek Shourya
914811b643 fix: build error for product updates modal. (#5944) 2024-11-04 14:04:59 +05:30
Nikhil
0dead39fd1 chore: device migration (#5939)
* chore: device migration

* chore: devices

* chore: update device migrations

* chore: update migration

* chore: update migrations

* chore: update device migrations
2024-11-01 22:40:39 +05:30
sriram veeraghanta
27d7d91185 fix: new set of migrations in db models 2024-11-01 21:24:57 +05:30
Lakhan Baheti
3696062372 [WEB-2730] chore: core/editor updates to support mobile editor (#5910)
* added editor changes w.r.t mobile-editor

* added external extensions option

* fix: type errors in image block

* added on transaction method

* fix: optional prop fixed

* fix: memoize the extensions array

* fix: added missing deps

* fix: image component types

* fix: remove range prop

* fix: type fixes and better names of img src

* fix: image load blinking

* fix: code review

* fix: props code review

* fix: coderabbit review

---------

Co-authored-by: Palanikannan M <akashmalinimurugu@gmail.com>
2024-10-30 17:39:02 +05:30
Lakhan Baheti
8ea34b5995 [WEB-2729] chore: updated live server auth cookies handling (#5913)
* chore: updated live server auth cookies handling

* chore: update token parsing logic

* fix: types and better logical separation between the existing two tokens

* fix: better fallback to use request headers for cookies

---------

Co-authored-by: Palanikannan M <akashmalinimurugu@gmail.com>
2024-10-30 17:38:29 +05:30
Bavisetti Narayan
403482fa6e fix: workspace user property migration (#5908)
* fix: workspace user property migration

* fix: issue relations migration
2024-10-30 13:52:14 +05:30
Nikhil
fe18eae8cd fix: integrity error on account creation (#5876)
* fix: integrity error on account creation

* fix: exception handling
2024-10-30 13:46:05 +05:30
rahulramesha
3f429a1dab minor build fix (#5929) 2024-10-29 20:51:56 +05:30
Ketan Sharma
22b616b03c [WEB-2449] fix: admin is not able to edit issues in notifications peek overview (#5877)
* fix backend

* fix missing arguments for allow permissions

* Revert "fix backend"

This reverts commit 208636d7c8.
2024-10-29 19:46:20 +05:30
Anmol Singh Bhatia
57eb08c8a2 chore: code refactoring (#5928)
* chore: de-dupe code splitting

* chore: code refactor
2024-10-29 19:39:55 +05:30
Prateek Shourya
4bc751b7ab [WEB-2500] feat: Product updates modal (What's new in Plane) (#5690)
* [WEB-2500] feat: Product updates modal (What's new in Plane)

* fix: build errors.

* fix: lint errors resolved.

* chore: minor improvements.

* chore: minor fixes
2024-10-29 19:26:00 +05:30
Aaryan Khandelwal
c423d7d9df [WEB-2717] chore: implemented issue attachment upload progress (#5901)
* chore: added attachment upload progress

* chore: add debounce while updating the upload status

* chore: update percentage calc logic

* chore: update debounce interval
2024-10-29 19:22:29 +05:30
rahulramesha
538e78f135 refactor timeline store for code splitting (#5926) 2024-10-29 17:57:45 +05:30
Aaryan Khandelwal
b4bbe3a8ba fix: change html tag name for callout (#5924) 2024-10-29 14:12:12 +05:30
Prateek Shourya
b67f352b90 fix: lint and build errors (#5923)
* fix: lint errors.

* fix: build errors
2024-10-29 13:45:18 +05:30
Anmol Singh Bhatia
8829575780 chore: app sidebar add issue button improvement (#5921) 2024-10-29 13:42:42 +05:30
rahulramesha
724adeff5c [WEB-2442] fix: Timeline Improvements and bug fixes (#5922)
* improve auto scroller logic

* fix drag indicator visibility for blocks

* modify timeline store logic and improve timeline scrolling logic

* fix width of block while dragging with left handle

* fix block arrow direction while block is out of viewport
2024-10-29 13:42:14 +05:30
rahulramesha
a88a39fb1e [WEB-2442] feat: Revamp Timeline Layout (#5915)
* chore: added issue relations in issue listing

* chore: added pagination for issue detail endpoint

* chore: bulk date update endpoint

* chore: appended the target date

* chore: issue relation new types defined

* fix: order by and issue filters

* fix: passed order by in pagination

* chore: changed the key for issue dates

* Revamp Timeline Layout

* fix block dragging

* minor ui fixes

* improve auto scroll UX

* remove unused import

* fix timeline layout heights

* modify base timeline store

* Segregate issue relation types

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-10-28 18:03:31 +05:30
Aaryan Khandelwal
f986bd83fd fix: callout content not being saved in description html (#5920) 2024-10-28 17:00:32 +05:30
Satish Gandham
6113aefde0 Fix issue with SQLite transactions (#5919) 2024-10-28 14:14:57 +05:30
Bavisetti Narayan
6d08cf2757 fix: rendered the analytics for labels (#5906)
* fix: rendered the analytics for labels

* fix: analytics exports
2024-10-24 20:35:27 +05:30
Bavisetti Narayan
2caf23fb71 fix: background task metadata (#5909) 2024-10-24 20:35:05 +05:30
Bavisetti Narayan
b33328dec5 fix: issue retrieval endpoint (#5907) 2024-10-24 20:33:16 +05:30
Aaryan Khandelwal
14b31e3fcd [PULSE-36] feat: callout component for pages and issue descriptions (#5856)
* feat: editor callouts

* chore: backspace action updated

* chore: update callout attributes types

* chore: revert emoji picker changes

* chore: removed class attribute

* chore: added sanitization for local storage values

* chore: disable emoji picker search
2024-10-24 15:36:38 +05:30
Satish Gandham
9fb353ef54 [WEB-2706] chore: Switch to wa-sqlite (#5859)
* fix layout switching when filter is not yet completely fetched

* add layout in issue filter params

* Handle cases when DB initialization failed

* chore: permission layer and updated issues v1 query from workspace to project level

* - Switch to using wa-sqlite instead of sqlite-wasm

* Code cleanup and fix indexes

* Add missing files

* - Import only required functions from sentry
- Wait till all the tables are created

* Skip workspace sync if one is already in progress.

* Sync workspace without using transaction

* Minor cleanup

* Close DB connection before deleting files
Fix clear OPFS on Safari

* Fix type issue

* Improve issue insert performance

* Refactor workspace sync

* Close the DB connection while switching workspaces

* Update web/core/local-db/worker/db.ts

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Worker cleanup and error handling

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Update web/core/local-db/worker/db.ts

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Update web/core/local-db/storage.sqlite.ts

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Update web/core/local-db/worker/db.ts

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Code cleanup

* Set default order by to created at and descending

* Wait for transactions to complete.

---------

Co-authored-by: rahulramesha <rahulramesham@gmail.com>
Co-authored-by: gurusainath <gurusainath007@gmail.com>
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
2024-10-24 15:35:02 +05:30
Ketan Sharma
ad25a972a1 [WEB-2587] fix: hide log work button for guest user (#5787)
* fix the rendering logic

* fix handle nullish value
2024-10-24 14:48:59 +05:30
Ketan Sharma
4157f3750b add missing background color (#5789) 2024-10-24 14:46:56 +05:30
Ketan Sharma
d7c5645948 [WEB-2606] fix: project members shouldn't be able to change others roles (#5802)
* [WEB-2606] fix: project members should not be able to change other project member's roles

* add better logic
2024-10-24 14:46:10 +05:30
Anmol Singh Bhatia
8d837eddb3 chore: calendar current date indicator improvement (#5880) 2024-10-24 14:42:44 +05:30
Anmol Singh Bhatia
0312455d66 fix: project state setting dnd (#5881) 2024-10-24 14:41:35 +05:30
Prateek Shourya
e4e83a947a [WEB-2479] fix: merge default and archived issue details endpoint. (#5882) 2024-10-24 14:40:50 +05:30
Akshita Goyal
2ecc379486 fix: truncated project name in analytics dropdown (#5883) 2024-10-24 14:39:32 +05:30
Prateek Shourya
bf220666dd [WEB-2326] fix: issue activity mutation on attachments upload. (#5886) 2024-10-24 14:36:30 +05:30
Anmol Singh Bhatia
074ad6d1a4 chore: intake issue back date snooze disabled (#5888) 2024-10-24 14:35:57 +05:30
Bavisetti Narayan
4b815f3769 fix: issue attachment uploads (#5904) 2024-10-23 21:04:10 +05:30
Anmol Singh Bhatia
56bb6e1f48 fix: draft issue type update outside click (#5902) 2024-10-23 20:11:28 +05:30
Bavisetti Narayan
5afa686a21 chore: issue attachment deletion (#5903) 2024-10-23 20:11:01 +05:30
Anmol Singh Bhatia
25a410719b fix: intake issue description and navigation (#5900) 2024-10-23 16:46:28 +05:30
Anmol Singh Bhatia
cbfcbba5d1 [WEB-2709] chore: intake issue navigation improvement (#5891)
* chore: intake issue navigation improvement

* chore: code refactor

* chore: intake issue navigation improvement

* chore: intake issue navigation improvement
2024-10-23 15:19:43 +05:30
Anmol Singh Bhatia
c4421f5f97 fix: issue widget modal rendering (#5896) 2024-10-23 15:19:26 +05:30
Anmol Singh Bhatia
84c06c4713 fix: guest user intake issue edit validation (#5898) 2024-10-23 15:19:10 +05:30
Bavisetti Narayan
6df98099f5 chore: filter the deleted issues stats (#5893) 2024-10-22 20:51:11 +05:30
Bavisetti Narayan
295f094916 chore: changed the annotate for cycle id (#5892) 2024-10-22 19:02:05 +05:30
Akshita Goyal
d859ab9c39 [WEB-2708] fix: intake module and cycle addition fixed (#5890)
* fix: intake module and cycle addition fixed

* chore: fixed the search endpoint

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-10-22 17:59:07 +05:30
Anmol Singh Bhatia
36b868e375 [WEB-2707] fix: draft issue module update and code refactor (#5889)
* chore: draft issue module update

* chore: code refactor
2024-10-22 16:16:29 +05:30
Aaryan Khandelwal
4c20be6cf2 [PE-68] fix: markdown transformation of mention and custom image components (#5864)
* fix: markdown content for mention and custom image extensions

* style: update issue embed upgrade card

* chore: added string escapes
2024-10-22 14:29:50 +05:30
Bavisetti Narayan
7bf4620bc1 chore: soft deletion of cycle and module (#5884)
* chore: soft deletion of cycle and module

* chore: cycle module soft delete

* chore: added the deletion task

* chore: updated the env example

* chore: cycle issue unique constraints

* chore: updated the Q operator
2024-10-22 14:21:26 +05:30
Nikhil
00eff43f4d fix: bucket policy script to handle error conditions (#5887)
* fix: bucket policy script to handle error conditions

* dev: handle edge cases
2024-10-22 14:19:43 +05:30
sriram veeraghanta
3d3f1b8f74 fix: typescript version consistency 2024-10-22 14:13:28 +05:30
sriram veeraghanta
b87516b0be chore: fixing inconsistent dependencies across the platform (#5885)
* chore: fixing inconsistent dependencies across the platform

* fix: fixing peer dependencies

* chore: yarn lock regeneration
2024-10-22 14:03:34 +05:30
Anmol Singh Bhatia
8a1d3c4cf9 chore: urgent priority icon improvement (#5879) 2024-10-22 13:25:22 +05:30
Akshita Goyal
0f25f39404 WEB-2381 Chore: intake refactor (#5752)
* chore: intake emails and forms

* fix: moved files to ee

* fix: intake form ui

* fix: settings apis integrated

* fix: removed publish api

* fix: removed space app

* fix: lint issue

* fix: removed logs

* fix: removed comment

* fix: improved success image
2024-10-22 12:09:03 +05:30
sriram veeraghanta
fb49644185 fix: renaming the action and formatting 2024-10-21 19:26:16 +05:30
Nikhil
b745a29454 fix: credential sending for file uploads (#5869) 2024-10-21 17:46:46 +05:30
M. Palanikannan
c940a2921e fix: validation of public and private assets (#5878) 2024-10-21 15:59:44 +05:30
Anmol Singh Bhatia
6f8df3279c [WEB-2681] fix: module progress indicator (#5842)
* fix: module progress indicator

* fix: module progress indicator
2024-10-21 15:48:35 +05:30
Prateek Shourya
b833e3b10c [WEB-2674] chore: open parent issues in peek-overview from the parent badge. (#5872)
* [WEB-2674] chore: open parent issues in peek-overview from the parent badge.

* chore: remove `_blank` target from ControlLink.
2024-10-21 14:20:00 +05:30
M. Palanikannan
5a0dc4a65a [PE-69] fix: image restoration fixed for new images in private bucket (#5839)
* regression: image aspect ratio fix

* fix: name of variables changed for clarity

* fix: restore only on error

* fix: restore image by handling it inside the image component

* fix: image restoration fixed and aspect ratio added to old images to stop updates on load

* fix: added back restoring logic for public images

* fix: add conditions

* fix: image attributes types

* fix: return for old images

* fix: remove passive false

* fix: eslint fixes

* fix: stopping infinite loading scenarios while restoring from error
2024-10-21 14:17:05 +05:30
Ketan Sharma
e866571e04 fix backend (#5875) 2024-10-21 13:07:36 +05:30
Bavisetti Narayan
3c3fc7cd6d chore: draft issue listing (#5874) 2024-10-21 13:02:20 +05:30
Bavisetti Narayan
db919420a7 [WEB-2693] chore: removed the deleted cycles from the issue list (#5868)
* chore: added the deleted cycles from list

* chore: removed the extra annotation

* chore: removed the frontend comment
2024-10-18 15:48:34 +05:30
M. Palanikannan
2982cd47a9 fix: remoteImageSrc to come from resolved source (#5867) 2024-10-18 14:21:07 +05:30
M. Palanikannan
81550ab5ef [PE-56] regression: image aspect ratio fix (#5792)
* regression: image aspect ratio fix

* fix: name of variables changed for clarity
2024-10-18 13:40:39 +05:30
Bavisetti Narayan
07402efd79 chore: filtered the deleted labels and modules (#5860) 2024-10-18 13:20:32 +05:30
Prateek Shourya
46302f41bc fix: improvements for project types. (#5857) 2024-10-18 11:08:07 +05:30
Ketan Sharma
9530884c59 fix the logic (#5807) 2024-10-17 17:08:49 +05:30
Prateek Shourya
173b49b4cb [WEB-2431] chore: profile settings page UI improvement (#5838)
* [WEB-2431] chore: timezone and language management.

* chore: remove project level timezone changes.

* chore: minor UI improvement.

* chore: minor improvements
2024-10-17 17:06:22 +05:30
Anmol Singh Bhatia
e581ac890e chore: workspace collaborators improvements (#5846) 2024-10-17 17:05:21 +05:30
Anmol Singh Bhatia
a7b58e4a93 [WEB-2625] chore: workspace favorite and draft improvement (#5855)
* chore: favorite empty state updated

* chore: added draft issue count in workspace members

* chore: workspace draft count improvement

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-10-17 17:02:25 +05:30
Bavisetti Narayan
d552913171 chore: updated queryset for soft delete (#5844) 2024-10-17 17:01:26 +05:30
Bavisetti Narayan
b6a7e45e8d chore: added draft cycle and module in draft issue (#5854) 2024-10-17 13:35:13 +05:30
Aaryan Khandelwal
6209aeec0b fix: color extension not working on issue description and published page (#5852)
* fix: color extension not working

* chore: update types
2024-10-17 13:26:23 +05:30
Anmol Singh Bhatia
1099c59b83 fix: draft issue empty state flicker (#5848) 2024-10-17 12:55:32 +05:30
Nikhil
9b2ffaaca8 fix: draft issue asset conversion to issue (#5849) 2024-10-17 12:51:13 +05:30
sriram veeraghanta
aa93cca7bf fix: workflow fixes 2024-10-16 21:07:01 +05:30
sriram veeraghanta
1191f74bfe fix: workflow fixes 2024-10-16 20:08:25 +05:30
sriram veeraghanta
fbd1f6334a fix: workflow fixes 2024-10-16 20:05:10 +05:30
Anmol Singh Bhatia
7d36d63eb1 [WEB-2682] fix: delete project mutation and workspace draft header validation (#5843)
* fix: workspace draft header action validation

* fix: delete project mutation
2024-10-16 16:13:26 +05:30
Nikhil
9b85306359 dev: move storage metadata collection to background job (#5818)
* fix: move storage metadata collection to background job

* fix: docker compose and env

* fix: archive endpoint
2024-10-16 13:55:49 +05:30
guru_sainath
cc613e57c9 chore: delete deprecated tables (#5833)
* migration: external source and id for issues

* fix: cleaning up deprecated favorite tables

* fix: removing deprecated models

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2024-10-16 00:33:57 +05:30
Bavisetti Narayan
6e63af7ca9 [WEB-2626] chore: removed the deleted issue's count (#5837)
* chore: removed the deleted issue count

* chore: added issue manager in burn down
2024-10-16 00:30:44 +05:30
guru_sainath
5f9af92faf fix: attachment_count in issue pagination v2 endpoint (#5840)
* fix: attachment_count in the issue pagination v2 endpoint

* fix: string comparison in description check in params
2024-10-15 23:46:57 +05:30
Anmol Singh Bhatia
4e70e894f6 chore: workspace draft issue type (#5836) 2024-10-15 18:59:22 +05:30
Anmol Singh Bhatia
ff090ecf39 fix: workspace draft move to project (#5834) 2024-10-15 17:14:56 +05:30
Akshita Goyal
645a261493 fix: Added a common dropdown component (#5826)
* fix: Added a common dropdown component

* fix: dropdown

* fix: estimate dropdown

* fix: removed consoles
2024-10-15 15:17:46 +05:30
Prateek Shourya
8d0611b2a7 [WEB-2613] chore: open parent and sibling issue in new tab from peek-overview/ issue detail page. (#5819) 2024-10-15 13:37:52 +05:30
Bavisetti Narayan
3d7d3c8af1 [WEB-2631] chore: changed the cascading logic for soft delete (#5829)
* chore: changed the cascading logic for soft delete

* chore: changed the delete key

* chore: added the key on delete in project base model
2024-10-15 13:30:44 +05:30
Prateek Shourya
662b99da92 [WEB-2577] improvement: use common create/update issue modal for accepting intake issues for consistency (#5830)
* [WEB-2577] improvement: use common create/update issue modal for accepting intake issues for consistency

* fix: lint errors.

* chore: minor UX copy fix.

* chore: minor indentation fix.
2024-10-15 13:11:14 +05:30
Prateek Shourya
fa25a816a7 [WEB-2549] chore: ux copy update for project access. (#5831) 2024-10-15 12:57:29 +05:30
Anmol Singh Bhatia
ee823d215e [WEB-2629] chore: workspace draft issue ux copy updated (#5825)
* chore: workspace draft issue ux copy updated

* chore: workspace draft issue ux copy updated
2024-10-14 17:26:54 +05:30
Akshita Goyal
4b450f8173 fix: moved dropdowns to chart component + added pending icon (#5824)
* fix: moved dropdowns to chart component + added pending icon

* fix: copy changes

* fix: review changes
2024-10-14 17:00:58 +05:30
Anmol Singh Bhatia
36229d92e0 [WEB-2629] fix: workspace draft delete and move mutation (#5822)
* fix: mutation fix

* chore: code refactor

* chore: code refactor

* chore: useWorkspaceIssueProperties added
2024-10-14 16:50:19 +05:30
Anmol Singh Bhatia
cb90810d02 chore: double click action added and code refactor (#5821) 2024-10-14 16:46:08 +05:30
Anmol Singh Bhatia
658542cc62 [WEB-2616] fix: issue widget attachment (#5820)
* fix: issue widget attachment

* chore: comment added
2024-10-14 16:32:31 +05:30
Nikhil
701af734cd fix: export for analytics and csv (#5815) 2024-10-13 02:11:32 +05:30
Nikhil
cf53cdf6ba fix: analytics tab for private bucket (#5814) 2024-10-13 01:27:48 +05:30
Nikhil
6490ace7c7 fix: intake issue (#5813) 2024-10-13 00:44:52 +05:30
Nikhil
0ac406e8c7 fix: private bucket (#5812)
* fix: workspace level issue creation

* dev: add draft issue support, fix your work tab and cache invalidation for workspace level logos

* chore: issue description

---------

Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
2024-10-13 00:31:28 +05:30
Aaryan Khandelwal
e404450e1a [WEB-310] regression: generate file url function (#5811)
* fix: generate file url function

* chore: remove unused imports

* chore: replace indexOf logix with startsWith
2024-10-12 23:39:50 +05:30
sriram veeraghanta
7cc86ad4c0 chore: removing unused packages 2024-10-12 01:43:22 +05:30
Anmol Singh Bhatia
3acc9ec133 fix: intake exception error (#5810) 2024-10-11 22:01:39 +05:30
Anmol Singh Bhatia
286ab7f650 fix: workspace draft issues count (#5809) 2024-10-11 21:28:05 +05:30
Aaryan Khandelwal
7e334203f1 [WEB-310] dev: private bucket implementation (#5793)
* chore: migrations and backmigration to move attachments to file asset

* chore: move attachments to file assets

* chore: update migration file to include created by and updated by and size

* chore: remove uninmport errors

* chore: make size as float field

* fix: file asset uploads

* chore: asset uploads migration changes

* chore: v2 assets endpoint

* chore: remove unused imports

* chore: issue attachments

* chore: issue attachments

* chore: workspace logo endpoints

* chore: private bucket changes

* chore: user asset endpoint

* chore: add logo_url validation

* chore: cover image urlk

* chore: change asset max length

* chore: pages endpoint

* chore: store the storage_metadata only when none

* chore: attachment asset apis

* chore: update create private bucket

* chore: make bucket private

* chore: fix response of user uploads

* fix: response of user uploads

* fix: job to fix file asset uploads

* fix: user asset endpoints

* chore: avatar for user profile

* chore: external apis user url endpoint

* chore: upload workspace and user asset actions updated

* chore: analytics endpoint

* fix: analytics export

* chore: avatar urls

* chore: update user avatar instances

* chore: avatar urls for assignees and creators

* chore: bucket permission script

* fix: all user avatr instances in the web app

* chore: update project cover image logic

* fix: issue attachment endpoint

* chore: patch endpoint for issue attachment

* chore: attachments

* chore: change attachment storage class

* chore: update issue attachment endpoints

* fix: issue attachment

* chore: update issue attachment implementation

* chore: page asset endpoints

* fix: web build errors

* chore: attachments

* chore: page asset urls

* chore: comment and issue asset endpoints

* chore: asset endpoints

* chore: attachment endpoints

* chore: bulk asset endpoint

* chore: restore endpoint

* chore: project assets endpoints

* chore: asset url

* chore: add delete asset endpoints

* chore: fix asset upload endpoint

* chore: update patch endpoints

* chore: update patch endpoint

* chore: update editor image handling

* chore: asset restore endpoints

* chore: avatar url for space assets

* chore: space app assets migration

* fix: space app urls

* chore: space endpoints

* fix: old editor images rendering logic

* fix: issue archive and attachment activity

* chore: asset deletes

* chore: attachment delete

* fix: issue attachment

* fix: issue attachment get

* chore: cover image url for projects

* chore: remove duplicate py file

* fix: url check function

* chore: chore project cover asset delete

* fix: migrations

* chore: delete migration files

* chore: update bucket

* fix: build errors

* chore: add asset url in intake attachment

* chore: project cover fix

* chore: update next.config

* chore: delete old workspace logos

* chore: workspace assets

* chore: asset get for space

* chore: update project modal

* chore: remove unused imports

* fix: space app editor helper

* chore: update rich-text read-only editor

* chore: create multiple column for entity identifiers

* chore: update migrations

* chore: remove entity identifier

* fix: issue assets

* chore: update maximum file size logic

* chore: update editor max file size logic

* fix: close modal after removing workspace logo

* chore: update uploaded asstes' status post issue creation

* chore: added file size limit to the space app

* dev: add file size limit restriction on all endpoints

* fix: remove old workspace logo and user avatar

---------

Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
2024-10-11 20:13:38 +05:30
Anmol Singh Bhatia
c9580ab794 chore workspace draft issue improvements (#5808) 2024-10-11 19:51:38 +05:30
Aaryan Khandelwal
e7065af358 [WEB-2494] dev: custom text color and background color extensions (#5786)
* dev: created custom text color and background color extensions

* chore: update slash commands icon style

* chore: update constants

* chore: update variables css file selectors
2024-10-11 19:11:39 +05:30
Manish Gupta
74695e561a modified the action name (#5806) 2024-10-11 18:05:53 +05:30
Anmol Singh Bhatia
c9dbd1d5d1 [WEB-2388] chore: theme changes and workspace draft issue total count updated (#5805)
* chore: theme changes and total count updated

* chore: code refactor
2024-10-11 17:57:48 +05:30
Manish Gupta
6200890693 fix: updated branch build action with BUILD/RELEASE options (#5803) 2024-10-11 17:25:25 +05:30
guru_sainath
3011ef9da1 build-error: removed store prop from calendar store (#5801) 2024-10-11 15:53:58 +05:30
Anmol Singh Bhatia
bf7b3229d1 [WEB-2388] fix: workspace draft issues (#5800)
* fix: create issue modal handle close

* fix: workspace level draft issue store update

* chore: count added

* chore: added description html in list endpoint

* fix: workspace draft issue mutation

* fix: workspace draft issue empty state and count

---------

Co-authored-by: gurusainath <gurusainath007@gmail.com>
2024-10-11 15:23:32 +05:30
rahulramesha
2c96e042c6 fix workspace drafts build (#5798) 2024-10-10 22:59:27 +05:30
rahulramesha
9c2278a810 fix workspace draft build (#5795) 2024-10-10 20:50:43 +05:30
Anmol Singh Bhatia
332d2d5c68 [WEB-2388] dev: workspace draft issues (#5772)
* chore: workspace draft page added

* chore: workspace draft issues services added

* chore: workspace draft issue store added

* chore: workspace draft issue filter store added

* chore: issue rendering

* conflicts: resolved merge conflicts

* conflicts: handled draft issue store

* chore: draft issue modal

* chore: code optimisation

* chore: ui changes

* chore: workspace draft store and modal updated

* chore: workspace draft issue component added

* chore: updated store and workflow in draft issues

* chore: updated issue draft store

* chore: updated issue type cleanup in components

* chore: code refactor

* fix: build error

* fix: quick actions

* fix: update mutation

* fix: create update modal

* chore: commented project draft issue code

---------

Co-authored-by: gurusainath <gurusainath007@gmail.com>
Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-10-10 19:12:34 +05:30
guru_sainath
e9158f820f [WEB-2615] fix: module date validation during chart distribution generation (#5791)
* fix: module date validation while generating the chart distribution

* chore: indentation fix

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-10-10 18:33:59 +05:30
sriram veeraghanta
1e1733f6db Merge branch 'master' of github.com:makeplane/plane into preview 2024-10-10 17:24:47 +05:30
Bavisetti Narayan
5573d85d80 chore: only admin's can delete a project (#5790) 2024-10-10 17:24:18 +05:30
sriram veeraghanta
c1f881b2d1 Merge branch 'develop' of github.com:makeplane/plane into preview 2024-10-10 15:11:33 +05:30
sriram veeraghanta
9bab108329 Merge pull request #5788 from makeplane/preview
release: v0.23.1
2024-10-10 15:11:04 +05:30
sriram veeraghanta
5f4875cc60 fix: version bump 2024-10-10 15:05:03 +05:30
sriram veeraghanta
0c1c6dee99 fix: adding scheduled tracing 2024-10-10 14:57:42 +05:30
sriram veeraghanta
1639f34db0 Merge branch 'preview' of github.com:makeplane/plane into develop 2024-10-10 14:07:25 +05:30
Bavisetti Narayan
8a866e440c chore: only admin can changed the project settings (#5766) 2024-10-10 14:06:14 +05:30
Prateek Shourya
7495a7d0cb [WEB-2605] fix: update URL regex pattern to allow complex links. (#5767) 2024-10-10 14:06:14 +05:30
M. Palanikannan
2b1da96c3f fix: drag handle scrolling fixed (#5619)
* fix: drag handle scrolling fixed

* fix: closest scrollable parent found and scrolled

* fix: removed overflow auto from framerenderer

* fix: make dragging dynamic and smoother
2024-10-10 14:06:14 +05:30
Aaryan Khandelwal
daa06f1831 [WEB-2532] fix: custom theme mutation logic (#5685)
* fix: custom theme mutation logic

* chore: update querySelector element
2024-10-10 14:06:14 +05:30
M. Palanikannan
b97fcfb46d fix: show the full screen toolbar in read only instances as well (#5746) 2024-10-10 14:06:14 +05:30
M. Palanikannan
852fc9bac1 [WEB-2603] fix: remove validation of roles from the live server (#5761)
* fix: remove validation of roles from the live server

* chore: remove the service

* fix: remove all validation of authorization

* fix: props updated
2024-10-10 14:06:14 +05:30
Akshita Goyal
55f44e0245 fix: spreadsheet flicker issue (#5769) 2024-10-10 14:06:14 +05:30
Prateek Shourya
8981e52dcc [WEB-2601] improvement: add click to copy issue identifier on peek-overview and issue detail page. (#5760) 2024-10-10 14:06:14 +05:30
Akshita Goyal
d92dbaea72 [WEB-2589] Chore: inbox issue permissions (#5763)
* chore: changed permission in inbox issue

* chore: fixed permissions for intake

* fix: refactoring

* fix: lint

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-10-10 14:06:14 +05:30
dependabot[bot]
58f3d0a68c chore(deps): bump django in /apiserver/requirements (#5781)
Bumps [django](https://github.com/django/django) from 4.2.15 to 4.2.16.
- [Commits](https://github.com/django/django/compare/4.2.15...4.2.16)

---
updated-dependencies:
- dependency-name: django
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-10-10 14:06:14 +05:30
Akshita Goyal
45880b3a72 [WEB-2589] Chore: inbox issue permissions (#5763)
* chore: changed permission in inbox issue

* chore: fixed permissions for intake

* fix: refactoring

* fix: lint

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-10-09 17:48:52 +05:30
dependabot[bot]
992adb9794 chore(deps): bump django in /apiserver/requirements (#5781)
Bumps [django](https://github.com/django/django) from 4.2.15 to 4.2.16.
- [Commits](https://github.com/django/django/compare/4.2.15...4.2.16)

---
updated-dependencies:
- dependency-name: django
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-10-09 17:26:33 +05:30
Akshita Goyal
6d78418e79 fix: create cycle function (#5775)
* fix: create cycle function

* chore: draft and cycle version changes

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-10-08 20:01:15 +05:30
Prateek Shourya
6e52f1b434 [WEB-2601] improvement: add click to copy issue identifier on peek-overview and issue detail page. (#5760) 2024-10-08 18:43:13 +05:30
Aaryan Khandelwal
c3c1ea727d [WEB-2494] feat: text color and highlight options for all editors (#5653)
* feat: add text color and highlight options to pages

* style: rich text editor floating toolbar

* chore: remove unused function

* refactor: slash command components

* chore: move default text and background options to the top

* fix: sections filtering logic
2024-10-08 18:42:47 +05:30
Aaryan Khandelwal
5afc576dec refactor: export components (#5773) 2024-10-08 18:41:08 +05:30
Ketan Sharma
50ae32f3e1 [WEB-2555] fix: add "mark all as read" in the notifications header (#5770)
* move mark all as read to header and remove it from dropdown

* made recommended changes
2024-10-08 17:13:35 +05:30
Akshita Goyal
0451593057 fix: spreadsheet flicker issue (#5769) 2024-10-08 17:10:16 +05:30
M. Palanikannan
be092ac99f [WEB-2603] fix: remove validation of roles from the live server (#5761)
* fix: remove validation of roles from the live server

* chore: remove the service

* fix: remove all validation of authorization

* fix: props updated
2024-10-08 16:55:26 +05:30
Anmol Singh Bhatia
f73a603226 [WEB-2380] chore: cycle sidebar refactor (#5759)
* chore: cycle sidebar refactor

* chore: code splitting

* chore: code refactor

* chore: code refactor
2024-10-08 16:54:44 +05:30
Aaryan Khandelwal
b27249486a [PE-45] feat: page export as PDF & Markdown (#5705)
* feat: export page as pdf and markdown

* chore: add image conversion logic
2024-10-08 16:54:02 +05:30
Anmol Singh Bhatia
20c9e232e7 chore: IssueParentDetail added to issue peekoverview (#5751) 2024-10-08 16:53:07 +05:30
Bavisetti Narayan
d168fd4bfa [WEB-2388] fix: workspace draft issues migration (#5749)
* fix: workspace draft issues

* chore: changed the timezone key

* chore: migration changes
2024-10-08 16:51:57 +05:30
M. Palanikannan
7317975b04 fix: show the full screen toolbar in read only instances as well (#5746) 2024-10-08 16:50:32 +05:30
Aaryan Khandelwal
39195d0d89 [WEB-2532] fix: custom theme mutation logic (#5685)
* fix: custom theme mutation logic

* chore: update querySelector element
2024-10-08 16:47:16 +05:30
Mihir
6bf0e27b66 [WEB-2433] chore-Update name of the Layout (#5661)
* Updated layout names

* Corrected character casing for titles
2024-10-08 16:44:50 +05:30
M. Palanikannan
5fb7e98b7c fix: drag handle scrolling fixed (#5619)
* fix: drag handle scrolling fixed

* fix: closest scrollable parent found and scrolled

* fix: removed overflow auto from framerenderer

* fix: make dragging dynamic and smoother
2024-10-08 16:44:05 +05:30
Prateek Shourya
328b6961a2 [WEB-2605] fix: update URL regex pattern to allow complex links. (#5767) 2024-10-08 13:20:27 +05:30
Bavisetti Narayan
39eabc28b5 chore: only admin can changed the project settings (#5766) 2024-10-07 20:07:24 +05:30
sriram veeraghanta
d97ca68229 Merge pull request #5764 from makeplane/preview
release: v0.23.0
2024-10-07 18:54:49 +05:30
Bavisetti Narayan
c92fe6191e [WEB-2600] fix: estimate point deletion (#5762)
* chore: only delete the cascade fields

* chore: logged the issue activity
2024-10-07 17:23:37 +05:30
pablohashescobar
7bb04003ea fix: instance trace 2024-10-07 15:56:27 +05:30
sriram veeraghanta
19dab1fad0 Merge branch 'preview' of github.com:makeplane/plane into develop 2024-10-07 13:20:07 +05:30
M. Palanikannan
5f7b6ecf7f fix: image deletion on submit fixed in comments (#5748)
* fix: image deletion on submit fixed in comments

* fix: cleareditor added to read only editor

* fix: image component double drop fixed

* feat: multiple image selection and uploading

* fix: click event on read only instance

* fix: made things async

* fix: prevented default behaviour

* fix: removed extra dep and cleaned up logic
2024-10-07 13:12:16 +05:30
guru_sainath
dfd3af13cf fix: handled favorite entity data null (#5756) 2024-10-07 12:57:15 +05:30
pablohashescobar
4cc1b79d81 chore: instance tracing 2024-10-04 21:35:13 +05:30
sriram veeraghanta
4a6f646317 fix: lockfile update 2024-10-04 19:38:19 +05:30
sriram veeraghanta
b8e21d92bf Merge branch 'preview' of github.com:makeplane/plane into preview 2024-10-04 19:26:06 +05:30
sriram veeraghanta
b87d5c5be6 fix: version upgrade 2024-10-04 19:25:49 +05:30
ach5948
ceda06e88d fix: Remove typo from Contributing doc (#5736) 2024-10-04 19:24:47 +05:30
sriram veeraghanta
eb344881c2 Merge branch 'preview' of github.com:makeplane/plane into develop 2024-10-04 19:22:26 +05:30
Satish Gandham
01257a6936 chore: permission layer and updated issues v1 query from workspace to project level (#5753)
Co-authored-by: gurusainath <gurusainath007@gmail.com>
2024-10-04 18:34:46 +05:30
Prateek Shourya
51b01ebcac [WEB-2580] chore: improvements for custom search select. (#5744)
* [WEB-2580] chore: improvements for custom search select.

* chore: update optionTooltip prop.

* chore: update option tooltip prop.

* chore: minor updates.
2024-10-04 17:31:09 +05:30
sriram veeraghanta
0a8d66dcc3 fix: trace information setup 2024-10-04 16:40:33 +05:30
sriram veeraghanta
a5e3e4fe7d fix: api tracing 2024-10-04 01:14:29 +05:30
sriram veeraghanta
707570ca7a Merge pull request #5041 from makeplane/preview
release: v0.22-dev
2024-07-05 13:28:45 +05:30
sriram veeraghanta
c76af7d7d6 Merge pull request #4688 from makeplane/preview
release: v0.21-dev
2024-06-03 18:54:06 +05:30
sriram veeraghanta
1dcea9bcc8 Merge pull request #4569 from makeplane/preview
release: v0.20-dev
2024-05-23 19:55:06 +05:30
sriram veeraghanta
da957e06b6 Merge pull request #4349 from makeplane/preview
release: v0.19-dev
2024-05-03 20:36:07 +05:30
sriram veeraghanta
a0b9596cb4 Merge pull request #4239 from makeplane/preview
chore:version update
2024-04-19 12:01:15 +05:30
sriram veeraghanta
f71e8a3a0f Merge pull request #4238 from makeplane/preview
release: v0.18-dev
2024-04-19 11:56:03 +05:30
sriram veeraghanta
002fb4547b Merge pull request #4107 from makeplane/preview
release: v0.17-dev
2024-04-02 20:07:48 +05:30
sriram veeraghanta
c1b1ba35c1 Merge pull request #3878 from makeplane/preview
release: v0.16-dev
2024-03-05 20:04:08 +05:30
sriram veeraghanta
4566d6e80c Merge pull request #3697 from makeplane/preview
release: 0.15.4-dev
2024-02-19 19:30:06 +05:30
sriram veeraghanta
e8d359e625 Merge pull request #3674 from makeplane/preview
fix: build branch docker images push on release
2024-02-15 14:35:32 +05:30
sriram veeraghanta
351eba8d61 Merge pull request #3671 from makeplane/preview
release: peek overview issue description initial load bug (#3670)
2024-02-15 03:25:30 +05:30
sriram veeraghanta
1e27e37b51 Merge pull request #3666 from makeplane/preview
release: v0.15.2-dev
2024-02-14 19:41:55 +05:30
sriram veeraghanta
7df2e9cf11 Merge pull request #3632 from makeplane/preview
release: v0.15.1-dev
2024-02-12 20:59:56 +05:30
sriram veeraghanta
c6e3f1b932 Merge pull request #3535 from makeplane/preview
release: 0.15-dev
2024-02-01 15:01:49 +05:30
775 changed files with 32485 additions and 13109 deletions

.github/actions/build-push-ce/action.yml (new file, 126 lines)

@@ -0,0 +1,126 @@
name: "Build and Push Docker Image"
description: "Reusable action for building and pushing Docker images"
inputs:
docker-username:
description: "The Dockerhub username"
required: true
docker-token:
description: "The Dockerhub Token"
required: true
# Docker Image Options
docker-image-owner:
description: "The owner of the Docker image"
required: true
docker-image-name:
description: "The name of the Docker image"
required: true
build-context:
description: "The build context"
required: true
default: "."
dockerfile-path:
description: "The path to the Dockerfile"
required: true
build-args:
description: "The build arguments"
required: false
default: ""
# Buildx Options
buildx-driver:
description: "Buildx driver"
required: true
default: "docker-container"
buildx-version:
description: "Buildx version"
required: true
default: "latest"
buildx-platforms:
description: "Buildx platforms"
required: true
default: "linux/amd64"
buildx-endpoint:
description: "Buildx endpoint"
required: true
default: "default"
# Release Build Options
build-release:
description: "Flag to publish release"
required: false
default: "false"
build-prerelease:
description: "Flag to publish prerelease"
required: false
default: "false"
release-version:
description: "The release version"
required: false
default: "latest"
runs:
using: "composite"
steps:
- name: Set Docker Tag
shell: bash
env:
IMG_OWNER: ${{ inputs.docker-image-owner }}
IMG_NAME: ${{ inputs.docker-image-name }}
BUILD_RELEASE: ${{ inputs.build-release }}
IS_PRERELEASE: ${{ inputs.build-prerelease }}
REL_VERSION: ${{ inputs.release-version }}
run: |
FLAT_BRANCH_VERSION=$(echo "${{ github.ref_name }}" | sed 's/[^a-zA-Z0-9.-]//g')
if [ "${{ env.BUILD_RELEASE }}" == "true" ]; then
semver_regex="^v([0-9]+)\.([0-9]+)\.([0-9]+)(-[a-zA-Z0-9]+(-[a-zA-Z0-9]+)*)?$"
if [[ ! ${{ env.REL_VERSION }} =~ $semver_regex ]]; then
echo "Invalid Release Version Format : ${{ env.REL_VERSION }}"
echo "Please provide a valid SemVer version"
echo "e.g. v1.2.3 or v1.2.3-alpha-1"
echo "Exiting the build process"
exit 1 # Exit with status 1 to fail the step
fi
TAG=${{ env.IMG_OWNER }}/${{ env.IMG_NAME }}:${{ env.REL_VERSION }}
if [ "${{ env.IS_PRERELEASE }}" != "true" ]; then
TAG=${TAG},${{ env.IMG_OWNER }}/${{ env.IMG_NAME }}:stable
fi
elif [ "${{ env.TARGET_BRANCH }}" == "master" ]; then
TAG=${{ env.IMG_OWNER }}/${{ env.IMG_NAME }}:latest
else
TAG=${{ env.IMG_OWNER }}/${{ env.IMG_NAME }}:${FLAT_BRANCH_VERSION}
fi
echo "DOCKER_TAGS=${TAG}" >> $GITHUB_ENV
- name: Login to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ inputs.docker-username }}
password: ${{ inputs.docker-token}}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
with:
driver: ${{ inputs.buildx-driver }}
version: ${{ inputs.buildx-version }}
endpoint: ${{ inputs.buildx-endpoint }}
- name: Check out the repo
uses: actions/checkout@v4
- name: Build and Push Docker Image
uses: docker/build-push-action@v5.1.0
with:
context: ${{ inputs.build-context }}
file: ${{ inputs.dockerfile-path }}
platforms: ${{ inputs.buildx-platforms }}
tags: ${{ env.DOCKER_TAGS }}
push: true
build-args: ${{ inputs.build-args }}
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ inputs.docker-username }}
DOCKER_PASSWORD: ${{ inputs.docker-token }}
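The "Set Docker Tag" step above can be exercised outside Actions by substituting plain shell variables for the `${{ inputs.* }}` and `${{ github.* }}` expressions. A minimal sketch, assuming a hypothetical stable release `v1.2.3` of the `makeplane/plane-frontend` image:

```shell
#!/usr/bin/env bash
# Sketch of the composite action's tag-resolution logic with GitHub
# expressions replaced by ordinary variables (all values assumed).
IMG_OWNER="makeplane"      # inputs.docker-image-owner
IMG_NAME="plane-frontend"  # inputs.docker-image-name
BUILD_RELEASE="true"       # inputs.build-release
IS_PRERELEASE="false"      # inputs.build-prerelease
REL_VERSION="v1.2.3"       # inputs.release-version
TARGET_BRANCH="preview"    # github.ref_name

semver_regex="^v([0-9]+)\.([0-9]+)\.([0-9]+)(-[a-zA-Z0-9]+(-[a-zA-Z0-9]+)*)?$"
if [ "$BUILD_RELEASE" == "true" ]; then
  if [[ ! $REL_VERSION =~ $semver_regex ]]; then
    echo "Invalid Release Version Format : $REL_VERSION" >&2
    exit 1  # fail the step, as the action does
  fi
  TAG="$IMG_OWNER/$IMG_NAME:$REL_VERSION"
  if [ "$IS_PRERELEASE" != "true" ]; then
    # Full releases also move the "stable" tag.
    TAG="$TAG,$IMG_OWNER/$IMG_NAME:stable"
  fi
elif [ "$TARGET_BRANCH" == "master" ]; then
  TAG="$IMG_OWNER/$IMG_NAME:latest"
else
  # Non-release, non-master builds: sanitised branch name as the tag.
  TAG="$IMG_OWNER/$IMG_NAME:$(echo "$TARGET_BRANCH" | sed 's/[^a-zA-Z0-9.-]//g')"
fi
echo "$TAG"
```

A stable release therefore pushes both the version tag and `stable` — the comma-separated value matches the multi-tag `tags:` input that `docker/build-push-action` accepts.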


@@ -83,7 +83,7 @@ jobs:
endpoint: ${{ env.BUILDX_ENDPOINT }}
- name: Build and Push to Docker Hub
- uses: docker/build-push-action@v5.1.0
+ uses: docker/build-push-action@v6.9.0
with:
context: ./aio
file: ./aio/Dockerfile-base-full
@@ -124,7 +124,7 @@ jobs:
endpoint: ${{ env.BUILDX_ENDPOINT }}
- name: Build and Push to Docker Hub
- uses: docker/build-push-action@v5.1.0
+ uses: docker/build-push-action@v6.9.0
with:
context: ./aio
file: ./aio/Dockerfile-base-slim


@@ -128,7 +128,7 @@ jobs:
uses: actions/checkout@v4
- name: Build and Push to Docker Hub
- uses: docker/build-push-action@v5.1.0
+ uses: docker/build-push-action@v6.9.0
with:
context: .
file: ./aio/Dockerfile-app
@@ -188,7 +188,7 @@ jobs:
uses: actions/checkout@v4
- name: Build and Push to Docker Hub
- uses: docker/build-push-action@v5.1.0
+ uses: docker/build-push-action@v6.9.0
with:
context: .
file: ./aio/Dockerfile-app


@@ -1,29 +1,45 @@
- name: Branch Build
+ name: Branch Build CE
on:
workflow_dispatch:
inputs:
build_type:
description: "Type of build to run"
required: true
type: choice
default: "Build"
options:
- "Build"
- "Release"
releaseVersion:
description: "Release Version"
type: string
default: v0.0.0
isPrerelease:
description: "Is Pre-release"
type: boolean
default: false
required: true
arm64:
description: "Build for ARM64 architecture"
required: false
default: false
type: boolean
push:
branches:
- master
- preview
release:
types: [released, prereleased]
# push:
# branches:
# - master
env:
- TARGET_BRANCH: ${{ github.ref_name || github.event.release.target_commitish }}
+ TARGET_BRANCH: ${{ github.ref_name }}
ARM64_BUILD: ${{ github.event.inputs.arm64 }}
IS_PRERELEASE: ${{ github.event.release.prerelease }}
BUILD_TYPE: ${{ github.event.inputs.build_type }}
RELEASE_VERSION: ${{ github.event.inputs.releaseVersion }}
IS_PRERELEASE: ${{ github.event.inputs.isPrerelease }}
jobs:
branch_build_setup:
name: Build Setup
- runs-on: ubuntu-latest
+ runs-on: ubuntu-20.04
outputs:
gh_branch_name: ${{ steps.set_env_variables.outputs.TARGET_BRANCH }}
gh_buildx_driver: ${{ steps.set_env_variables.outputs.BUILDX_DRIVER }}
@@ -36,13 +52,24 @@ jobs:
build_space: ${{ steps.changed_files.outputs.space_any_changed }}
build_web: ${{ steps.changed_files.outputs.web_any_changed }}
build_live: ${{ steps.changed_files.outputs.live_any_changed }}
flat_branch_name: ${{ steps.set_env_variables.outputs.FLAT_BRANCH_NAME }}
dh_img_web: ${{ steps.set_env_variables.outputs.DH_IMG_WEB }}
dh_img_space: ${{ steps.set_env_variables.outputs.DH_IMG_SPACE }}
dh_img_admin: ${{ steps.set_env_variables.outputs.DH_IMG_ADMIN }}
dh_img_live: ${{ steps.set_env_variables.outputs.DH_IMG_LIVE }}
dh_img_backend: ${{ steps.set_env_variables.outputs.DH_IMG_BACKEND }}
dh_img_proxy: ${{ steps.set_env_variables.outputs.DH_IMG_PROXY }}
build_type: ${{steps.set_env_variables.outputs.BUILD_TYPE}}
build_release: ${{ steps.set_env_variables.outputs.BUILD_RELEASE }}
build_prerelease: ${{ steps.set_env_variables.outputs.BUILD_PRERELEASE }}
release_version: ${{ steps.set_env_variables.outputs.RELEASE_VERSION }}
steps:
- id: set_env_variables
name: Set Environment Variables
run: |
- if [ "${{ env.TARGET_BRANCH }}" == "master" ] || [ "${{ env.ARM64_BUILD }}" == "true" ] || ([ "${{ github.event_name }}" == "release" ] && [ "${{ env.IS_PRERELEASE }}" != "true" ]); then
+ if [ "${{ env.ARM64_BUILD }}" == "true" ] || ([ "${{ env.BUILD_TYPE }}" == "Release" ] && [ "${{ env.IS_PRERELEASE }}" != "true" ]); then
echo "BUILDX_DRIVER=cloud" >> $GITHUB_OUTPUT
echo "BUILDX_VERSION=lab:latest" >> $GITHUB_OUTPUT
echo "BUILDX_PLATFORMS=linux/amd64,linux/arm64" >> $GITHUB_OUTPUT
@@ -53,9 +80,43 @@ jobs:
echo "BUILDX_PLATFORMS=linux/amd64" >> $GITHUB_OUTPUT
echo "BUILDX_ENDPOINT=" >> $GITHUB_OUTPUT
fi
- echo "TARGET_BRANCH=${{ env.TARGET_BRANCH }}" >> $GITHUB_OUTPUT
- flat_branch_name=$(echo ${{ env.TARGET_BRANCH }} | sed 's/[^a-zA-Z0-9\._]/-/g')
- echo "FLAT_BRANCH_NAME=${flat_branch_name}" >> $GITHUB_OUTPUT
+ BR_NAME=$( echo "${{ env.TARGET_BRANCH }}" |sed 's/[^a-zA-Z0-9.-]//g')
+ echo "TARGET_BRANCH=$BR_NAME" >> $GITHUB_OUTPUT
echo "DH_IMG_WEB=plane-frontend" >> $GITHUB_OUTPUT
echo "DH_IMG_SPACE=plane-space" >> $GITHUB_OUTPUT
echo "DH_IMG_ADMIN=plane-admin" >> $GITHUB_OUTPUT
echo "DH_IMG_LIVE=plane-live" >> $GITHUB_OUTPUT
echo "DH_IMG_BACKEND=plane-backend" >> $GITHUB_OUTPUT
echo "DH_IMG_PROXY=plane-proxy" >> $GITHUB_OUTPUT
echo "BUILD_TYPE=${{env.BUILD_TYPE}}" >> $GITHUB_OUTPUT
BUILD_RELEASE=false
BUILD_PRERELEASE=false
RELVERSION="latest"
if [ "${{ env.BUILD_TYPE }}" == "Release" ]; then
FLAT_RELEASE_VERSION=$(echo "${{ env.RELEASE_VERSION }}" | sed 's/[^a-zA-Z0-9.-]//g')
echo "FLAT_RELEASE_VERSION=${FLAT_RELEASE_VERSION}" >> $GITHUB_OUTPUT
semver_regex="^v([0-9]+)\.([0-9]+)\.([0-9]+)(-[a-zA-Z0-9]+(-[a-zA-Z0-9]+)*)?$"
if [[ ! $FLAT_RELEASE_VERSION =~ $semver_regex ]]; then
echo "Invalid Release Version Format : $FLAT_RELEASE_VERSION"
echo "Please provide a valid SemVer version"
echo "e.g. v1.2.3 or v1.2.3-alpha-1"
echo "Exiting the build process"
exit 1 # Exit with status 1 to fail the step
fi
BUILD_RELEASE=true
RELVERSION=$FLAT_RELEASE_VERSION
if [ "${{ env.IS_PRERELEASE }}" == "true" ]; then
BUILD_PRERELEASE=true
fi
fi
echo "BUILD_RELEASE=${BUILD_RELEASE}" >> $GITHUB_OUTPUT
echo "BUILD_PRERELEASE=${BUILD_PRERELEASE}" >> $GITHUB_OUTPUT
echo "RELEASE_VERSION=${RELVERSION}" >> $GITHUB_OUTPUT
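The sanitisation-plus-SemVer gate above can be checked in isolation. A small sketch with hypothetical version strings; the regex and the delete-style `sed` are copied verbatim from the workflow:

```shell
#!/usr/bin/env bash
# Validate candidate release versions the way the setup job does:
# strip disallowed characters, then match against the SemVer regex.
semver_regex="^v([0-9]+)\.([0-9]+)\.([0-9]+)(-[a-zA-Z0-9]+(-[a-zA-Z0-9]+)*)?$"

check() {
  local flat
  flat=$(echo "$1" | sed 's/[^a-zA-Z0-9.-]//g')  # same sanitisation as FLAT_RELEASE_VERSION
  if [[ $flat =~ $semver_regex ]]; then
    echo "$1 -> $flat : valid"
  else
    echo "$1 -> $flat : invalid"
  fi
}

check "v1.2.3"          # valid
check "v1.2.3-alpha-1"  # valid (prerelease suffix allowed by the regex)
check "1.2.3"           # invalid: the leading "v" is required
check "v1.2"            # invalid: the patch component is required
```

Note this is the newer delete-style sanitiser (`s/[^a-zA-Z0-9.-]//g`); the earlier workflow replaced disallowed characters with `-` instead of removing them.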
- id: checkout_files
name: Checkout Files
@@ -73,24 +134,24 @@ jobs:
admin:
- admin/**
- packages/**
- 'package.json'
- 'yarn.lock'
- 'tsconfig.json'
- 'turbo.json'
- "package.json"
- "yarn.lock"
- "tsconfig.json"
- "turbo.json"
space:
- space/**
- packages/**
- 'package.json'
- 'yarn.lock'
- 'tsconfig.json'
- 'turbo.json'
- "package.json"
- "yarn.lock"
- "tsconfig.json"
- "turbo.json"
web:
- web/**
- packages/**
- 'package.json'
- 'yarn.lock'
- 'tsconfig.json'
- 'turbo.json'
- "package.json"
- "yarn.lock"
- "tsconfig.json"
- "turbo.json"
live:
- live/**
- packages/**
@@ -99,338 +160,224 @@ jobs:
- 'tsconfig.json'
- 'turbo.json'
branch_build_push_web:
if: ${{ needs.branch_build_setup.outputs.build_web == 'true' || github.event_name == 'workflow_dispatch' || github.event_name == 'release' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
name: Build-Push Web Docker Image
runs-on: ubuntu-20.04
needs: [branch_build_setup]
env:
FRONTEND_TAG: makeplane/plane-frontend:${{ needs.branch_build_setup.outputs.flat_branch_name }}
TARGET_BRANCH: ${{ needs.branch_build_setup.outputs.gh_branch_name }}
BUILDX_DRIVER: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
BUILDX_VERSION: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
BUILDX_PLATFORMS: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
BUILDX_ENDPOINT: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
steps:
- name: Set Frontend Docker Tag
run: |
if [ "${{ github.event_name }}" == "release" ]; then
TAG=makeplane/plane-frontend:${{ github.event.release.tag_name }}
if [ "${{ env.IS_PRERELEASE }}" != "true" ]; then
TAG=${TAG},makeplane/plane-frontend:stable
fi
elif [ "${{ env.TARGET_BRANCH }}" == "master" ]; then
TAG=makeplane/plane-frontend:latest
else
TAG=${{ env.FRONTEND_TAG }}
fi
echo "FRONTEND_TAG=${TAG}" >> $GITHUB_ENV
- name: Login to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
with:
driver: ${{ env.BUILDX_DRIVER }}
version: ${{ env.BUILDX_VERSION }}
endpoint: ${{ env.BUILDX_ENDPOINT }}
- name: Check out the repo
uses: actions/checkout@v4
- name: Build and Push Frontend to Docker Container Registry
uses: docker/build-push-action@v5.1.0
with:
context: .
file: ./web/Dockerfile.web
platforms: ${{ env.BUILDX_PLATFORMS }}
tags: ${{ env.FRONTEND_TAG }}
push: true
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
branch_build_push_admin:
- if: ${{ needs.branch_build_setup.outputs.build_admin== 'true' || github.event_name == 'workflow_dispatch' || github.event_name == 'release' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
+ if: ${{ needs.branch_build_setup.outputs.build_admin == 'true' || github.event_name == 'workflow_dispatch' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
name: Build-Push Admin Docker Image
runs-on: ubuntu-20.04
needs: [branch_build_setup]
env:
ADMIN_TAG: makeplane/plane-admin:${{ needs.branch_build_setup.outputs.flat_branch_name }}
TARGET_BRANCH: ${{ needs.branch_build_setup.outputs.gh_branch_name }}
BUILDX_DRIVER: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
BUILDX_VERSION: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
BUILDX_PLATFORMS: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
BUILDX_ENDPOINT: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
steps:
- name: Set Admin Docker Tag
run: |
if [ "${{ github.event_name }}" == "release" ]; then
TAG=makeplane/plane-admin:${{ github.event.release.tag_name }}
if [ "${{ env.IS_PRERELEASE }}" != "true" ]; then
TAG=${TAG},makeplane/plane-admin:stable
fi
elif [ "${{ env.TARGET_BRANCH }}" == "master" ]; then
TAG=makeplane/plane-admin:latest
else
TAG=${{ env.ADMIN_TAG }}
fi
echo "ADMIN_TAG=${TAG}" >> $GITHUB_ENV
- name: Login to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
with:
driver: ${{ env.BUILDX_DRIVER }}
version: ${{ env.BUILDX_VERSION }}
endpoint: ${{ env.BUILDX_ENDPOINT }}
- name: Check out the repo
- id: checkout_files
name: Checkout Files
uses: actions/checkout@v4
- - name: Build and Push Frontend to Docker Container Registry
-   uses: docker/build-push-action@v5.1.0
+ - name: Admin Build and Push
+   uses: ./.github/actions/build-push-ce
with:
context: .
file: ./admin/Dockerfile.admin
platforms: ${{ env.BUILDX_PLATFORMS }}
tags: ${{ env.ADMIN_TAG }}
push: true
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
build-release: ${{ needs.branch_build_setup.outputs.build_release }}
build-prerelease: ${{ needs.branch_build_setup.outputs.build_prerelease }}
release-version: ${{ needs.branch_build_setup.outputs.release_version }}
docker-username: ${{ secrets.DOCKERHUB_USERNAME }}
docker-token: ${{ secrets.DOCKERHUB_TOKEN }}
docker-image-owner: makeplane
docker-image-name: ${{ needs.branch_build_setup.outputs.dh_img_admin }}
build-context: .
dockerfile-path: ./admin/Dockerfile.admin
buildx-driver: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
buildx-version: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
buildx-platforms: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
buildx-endpoint: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
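For reference, the tag scheme these build jobs follow (a release build tags the image with the release version, plus `stable` when it is not a prerelease; a `master` build tags `latest`; any other branch uses the flattened branch name) can be sketched as below. This is an illustrative Python sketch of the selection logic, not part of the workflow, and it assumes the `build-push-ce` composite action keeps the same scheme:

```python
def docker_tags(image, event_name, release_tag="", is_prerelease=False,
                target_branch="", flat_branch_name=""):
    """Mirror the tag-selection logic used by the build jobs (illustrative)."""
    if event_name == "release":
        tags = [f"{image}:{release_tag}"]
        if not is_prerelease:
            # Stable (non-prerelease) releases also move the :stable tag.
            tags.append(f"{image}:stable")
        return tags
    if target_branch == "master":
        return [f"{image}:latest"]
    # Any other branch: use the flattened branch name as the tag.
    return [f"{image}:{flat_branch_name}"]

print(docker_tags("makeplane/plane-admin", "release", "v0.23.1", False))
# → ['makeplane/plane-admin:v0.23.1', 'makeplane/plane-admin:stable']
```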
branch_build_push_web:
if: ${{ needs.branch_build_setup.outputs.build_web == 'true' || github.event_name == 'workflow_dispatch' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
name: Build-Push Web Docker Image
runs-on: ubuntu-20.04
needs: [branch_build_setup]
steps:
- id: checkout_files
name: Checkout Files
uses: actions/checkout@v4
- name: Web Build and Push
uses: ./.github/actions/build-push-ce
with:
build-release: ${{ needs.branch_build_setup.outputs.build_release }}
build-prerelease: ${{ needs.branch_build_setup.outputs.build_prerelease }}
release-version: ${{ needs.branch_build_setup.outputs.release_version }}
docker-username: ${{ secrets.DOCKERHUB_USERNAME }}
docker-token: ${{ secrets.DOCKERHUB_TOKEN }}
docker-image-owner: makeplane
docker-image-name: ${{ needs.branch_build_setup.outputs.dh_img_web }}
build-context: .
dockerfile-path: ./web/Dockerfile.web
buildx-driver: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
buildx-version: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
buildx-platforms: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
buildx-endpoint: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
branch_build_push_space:
if: ${{ needs.branch_build_setup.outputs.build_space == 'true' || github.event_name == 'workflow_dispatch' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
name: Build-Push Space Docker Image
runs-on: ubuntu-20.04
needs: [branch_build_setup]
steps:
- id: checkout_files
name: Checkout Files
uses: actions/checkout@v4
- name: Space Build and Push
uses: ./.github/actions/build-push-ce
with:
build-release: ${{ needs.branch_build_setup.outputs.build_release }}
build-prerelease: ${{ needs.branch_build_setup.outputs.build_prerelease }}
release-version: ${{ needs.branch_build_setup.outputs.release_version }}
docker-username: ${{ secrets.DOCKERHUB_USERNAME }}
docker-token: ${{ secrets.DOCKERHUB_TOKEN }}
docker-image-owner: makeplane
docker-image-name: ${{ needs.branch_build_setup.outputs.dh_img_space }}
build-context: .
dockerfile-path: ./space/Dockerfile.space
buildx-driver: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
buildx-version: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
buildx-platforms: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
buildx-endpoint: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
branch_build_push_live:
if: ${{ needs.branch_build_setup.outputs.build_live == 'true' || github.event_name == 'workflow_dispatch' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
name: Build-Push Live Collaboration Docker Image
runs-on: ubuntu-20.04
needs: [branch_build_setup]
steps:
- id: checkout_files
name: Checkout Files
uses: actions/checkout@v4
- name: Live Build and Push
uses: ./.github/actions/build-push-ce
with:
build-release: ${{ needs.branch_build_setup.outputs.build_release }}
build-prerelease: ${{ needs.branch_build_setup.outputs.build_prerelease }}
release-version: ${{ needs.branch_build_setup.outputs.release_version }}
docker-username: ${{ secrets.DOCKERHUB_USERNAME }}
docker-token: ${{ secrets.DOCKERHUB_TOKEN }}
docker-image-owner: makeplane
docker-image-name: ${{ needs.branch_build_setup.outputs.dh_img_live }}
build-context: .
dockerfile-path: ./live/Dockerfile.live
buildx-driver: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
buildx-version: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
buildx-platforms: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
buildx-endpoint: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
branch_build_push_apiserver:
if: ${{ needs.branch_build_setup.outputs.build_apiserver == 'true' || github.event_name == 'workflow_dispatch' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
name: Build-Push API Server Docker Image
runs-on: ubuntu-20.04
needs: [branch_build_setup]
steps:
- id: checkout_files
name: Checkout Files
uses: actions/checkout@v4
- name: Backend Build and Push
uses: ./.github/actions/build-push-ce
with:
build-release: ${{ needs.branch_build_setup.outputs.build_release }}
build-prerelease: ${{ needs.branch_build_setup.outputs.build_prerelease }}
release-version: ${{ needs.branch_build_setup.outputs.release_version }}
docker-username: ${{ secrets.DOCKERHUB_USERNAME }}
docker-token: ${{ secrets.DOCKERHUB_TOKEN }}
docker-image-owner: makeplane
docker-image-name: ${{ needs.branch_build_setup.outputs.dh_img_backend }}
build-context: ./apiserver
dockerfile-path: ./apiserver/Dockerfile.api
buildx-driver: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
buildx-version: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
buildx-platforms: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
buildx-endpoint: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
branch_build_push_proxy:
if: ${{ needs.branch_build_setup.outputs.build_proxy == 'true' || github.event_name == 'workflow_dispatch' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
name: Build-Push Proxy Docker Image
runs-on: ubuntu-20.04
needs: [branch_build_setup]
steps:
- id: checkout_files
name: Checkout Files
uses: actions/checkout@v4
- name: Proxy Build and Push
uses: ./.github/actions/build-push-ce
with:
build-release: ${{ needs.branch_build_setup.outputs.build_release }}
build-prerelease: ${{ needs.branch_build_setup.outputs.build_prerelease }}
release-version: ${{ needs.branch_build_setup.outputs.release_version }}
docker-username: ${{ secrets.DOCKERHUB_USERNAME }}
docker-token: ${{ secrets.DOCKERHUB_TOKEN }}
docker-image-owner: makeplane
docker-image-name: ${{ needs.branch_build_setup.outputs.dh_img_proxy }}
build-context: ./nginx
dockerfile-path: ./nginx/Dockerfile
buildx-driver: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
buildx-version: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
buildx-platforms: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
buildx-endpoint: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
attach_assets_to_build:
if: ${{ needs.branch_build_setup.outputs.build_type == 'Build' }}
name: Attach Assets to Build
runs-on: ubuntu-20.04
needs: [branch_build_setup]
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Update Assets
run: |
cp ./deploy/selfhost/install.sh deploy/selfhost/setup.sh
- name: Attach Assets
id: attach_assets
uses: actions/upload-artifact@v4
with:
name: selfhost-assets
retention-days: 2
path: |
${{ github.workspace }}/deploy/selfhost/setup.sh
${{ github.workspace }}/deploy/selfhost/restore.sh
${{ github.workspace }}/deploy/selfhost/docker-compose.yml
${{ github.workspace }}/deploy/selfhost/variables.env
publish_release:
if: ${{ needs.branch_build_setup.outputs.build_type == 'Release' }}
name: Build Release
runs-on: ubuntu-20.04
needs:
[
branch_build_setup,
branch_build_push_admin,
branch_build_push_web,
branch_build_push_space,
branch_build_push_live,
branch_build_push_apiserver,
branch_build_push_proxy,
]
env:
REL_VERSION: ${{ needs.branch_build_setup.outputs.release_version }}
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Update Assets
run: |
cp ./deploy/selfhost/install.sh deploy/selfhost/setup.sh
- name: Create Release
id: create_release
uses: softprops/action-gh-release@v2.1.0
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # This token is provided by Actions, you do not need to create your own token
with:
tag_name: ${{ env.REL_VERSION }}
name: ${{ env.REL_VERSION }}
draft: false
prerelease: ${{ env.IS_PRERELEASE }}
generate_release_notes: true
files: |
${{ github.workspace }}/deploy/selfhost/setup.sh
${{ github.workspace }}/deploy/selfhost/restore.sh
${{ github.workspace }}/deploy/selfhost/docker-compose.yml
${{ github.workspace }}/deploy/selfhost/variables.env

View File

@@ -29,11 +29,11 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@v3
uses: actions/checkout@v4
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v2
uses: github/codeql-action/init@v3
with:
languages: ${{ matrix.language }}
# If you wish to specify custom queries, you can do so here or in a config file.
@@ -46,7 +46,7 @@ jobs:
# Autobuild attempts to build any compiled languages (C/C++, C#, Go, Java, or Swift).
# If this step fails, then you should remove it and run the build manually (see below)
- name: Autobuild
uses: github/codeql-action/autobuild@v2
uses: github/codeql-action/autobuild@v3
# Command-line programs to run using the OS shell.
# 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun
@@ -59,6 +59,6 @@ jobs:
# ./location_of_script_within_repo/buildscript.sh
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v2
uses: github/codeql-action/analyze@v3
with:
category: "/language:${{matrix.language}}"

View File

@@ -79,7 +79,7 @@ jobs:
uses: actions/checkout@v4
- name: Build and Push to Docker Hub
uses: docker/build-push-action@v5.1.0
uses: docker/build-push-action@v6.9.0
with:
context: .
file: ./aio/Dockerfile-app

View File

@@ -8,21 +8,20 @@ on:
env:
CURRENT_BRANCH: ${{ github.ref_name }}
TARGET_BRANCH: ${{ vars.SYNC_TARGET_BRANCH_NAME }} # The target branch that you would like to merge changes like develop
TARGET_BRANCH: "preview" # The target branch that you would like to merge changes like develop
GITHUB_TOKEN: ${{ secrets.ACCESS_TOKEN }} # Personal access token required to modify contents and workflows
REVIEWER: ${{ vars.SYNC_PR_REVIEWER }}
ACCOUNT_USER_NAME: ${{ vars.ACCOUNT_USER_NAME }}
ACCOUNT_USER_EMAIL: ${{ vars.ACCOUNT_USER_EMAIL }}
jobs:
Create_PR:
create_pull_request:
runs-on: ubuntu-latest
permissions:
pull-requests: write
contents: write
steps:
- name: Checkout code
uses: actions/checkout@v4.1.1
uses: actions/checkout@v4
with:
fetch-depth: 0 # Fetch all history for all branches and tags
@@ -48,6 +47,6 @@ jobs:
echo "Pull Request already exists: $PR_EXISTS"
else
echo "Creating new pull request"
PR_URL=$(gh pr create --base $TARGET_BRANCH --head $CURRENT_BRANCH --title "sync: community changes" --body "")
PR_URL=$(gh pr create --base $TARGET_BRANCH --head $CURRENT_BRANCH --title "${{ vars.SYNC_PR_TITLE }}" --body "")
echo "Pull Request created: $PR_URL"
fi

View File

@@ -17,7 +17,7 @@ jobs:
contents: read
steps:
- name: Checkout Code
uses: actions/checkout@v4.1.1
uses: actions/checkout@v4
with:
persist-credentials: false
fetch-depth: 0
@@ -35,9 +35,8 @@ jobs:
env:
GH_TOKEN: ${{ secrets.ACCESS_TOKEN }}
run: |
RUN_ID="${{ github.run_id }}"
TARGET_REPO="${{ vars.SYNC_TARGET_REPO }}"
TARGET_BRANCH="sync/${RUN_ID}"
TARGET_BRANCH="${{ vars.SYNC_TARGET_BRANCH_NAME }}"
SOURCE_BRANCH="${{ env.SOURCE_BRANCH_NAME }}"
git checkout $SOURCE_BRANCH

View File

@@ -4,7 +4,7 @@ Thank you for showing an interest in contributing to Plane! All kinds of contrib
## Submitting an issue
Before submitting a new issue, please search the [issues](https://github.com/makeplane/plane/issues) tab. Maybe an issue or discussion already exists and might inform you of workarounds. Otherwise, you can give new informplaneation.
Before submitting a new issue, please search the [issues](https://github.com/makeplane/plane/issues) tab. Maybe an issue or discussion already exists and might inform you of workarounds. Otherwise, you can give new information.
While we want to fix all the [issues](https://github.com/makeplane/plane/issues), before fixing a bug we need to be able to reproduce and confirm it. Please provide us with a minimal reproduction scenario using a repository or [Gist](https://gist.github.com/). Having a live, reproducible scenario gives us the information without asking questions back & forth with additional questions like:

View File

@@ -5,11 +5,13 @@ import { observer } from "mobx-react";
import { useTheme as useNextTheme } from "next-themes";
import { LogOut, UserCog2, Palette } from "lucide-react";
import { Menu, Transition } from "@headlessui/react";
// plane ui
import { Avatar } from "@plane/ui";
// hooks
import { API_BASE_URL, cn } from "@/helpers/common.helper";
import { useTheme, useUser } from "@/hooks/store";
// helpers
import { API_BASE_URL, cn } from "@/helpers/common.helper";
import { getFileURL } from "@/helpers/file.helper";
// hooks
import { useTheme, useUser } from "@/hooks/store";
// services
import { AuthService } from "@/services/auth.service";
@@ -122,7 +124,7 @@ export const SidebarDropdown = observer(() => {
<Menu.Button className="grid place-items-center outline-none">
<Avatar
name={currentUser.display_name}
src={currentUser.avatar ?? undefined}
src={getFileURL(currentUser.avatar_url)}
size={24}
shape="square"
className="!text-base"

View File

@@ -0,0 +1,14 @@
// helpers
import { API_BASE_URL } from "@/helpers/common.helper";
/**
* @description combine the file path with the base URL
* @param {string} path
* @returns {string} final URL with the base URL
*/
export const getFileURL = (path: string): string | undefined => {
if (!path) return undefined;
const isValidURL = path.startsWith("http");
if (isValidURL) return path;
return `${API_BASE_URL}${path}`;
};
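The same helper, transcribed to Python for illustration (`API_BASE_URL` below is a stand-in value, not the app's real configuration):

```python
API_BASE_URL = "https://api.example.test"  # stand-in for the app's configured base URL

def get_file_url(path):
    """Combine a file path with the base URL; pass absolute URLs through unchanged."""
    if not path:
        return None
    if path.startswith("http"):  # already an absolute URL
        return path
    return f"{API_BASE_URL}{path}"

print(get_file_url("/assets/avatar.png"))
# → https://api.example.test/assets/avatar.png
print(get_file_url("https://cdn.example.test/a.png"))
# → https://cdn.example.test/a.png
```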

View File

@@ -0,0 +1,21 @@
/**
* @description
* This function test whether a URL is valid or not.
*
* It accepts URLs with or without the protocol.
* @param {string} url
* @returns {boolean}
* @example
* checkURLValidity("https://example.com") => true
* checkURLValidity("example.com") => true
* checkURLValidity("example") => false
*/
export const checkURLValidity = (url: string): boolean => {
if (!url) return false;
// regex to support complex query parameters and fragments
const urlPattern =
/^(https?:\/\/)?((([a-z\d-]+\.)*[a-z\d-]+\.[a-z]{2,6})|(\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}))(:\d+)?(\/[\w.-]*)*(\?[^#\s]*)?(#[\w-]*)?$/i;
return urlPattern.test(url);
};
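A Python transcription of the same pattern, handy for checking the documented examples (illustrative only; the TypeScript above is the authoritative version):

```python
import re

# Same regex as checkURLValidity, translated to Python syntax.
URL_PATTERN = re.compile(
    r"^(https?://)?((([a-z\d-]+\.)*[a-z\d-]+\.[a-z]{2,6})"
    r"|(\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}))"
    r"(:\d+)?(/[\w.-]*)*(\?[^#\s]*)?(#[\w-]*)?$",
    re.IGNORECASE,
)

def check_url_validity(url):
    # Accepts URLs with or without the protocol, per the docstring above.
    return bool(url) and URL_PATTERN.match(url) is not None

print(check_url_validity("https://example.com"))  # → True
print(check_url_validity("example.com"))          # → True
print(check_url_validity("example"))              # → False
```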

View File

@@ -1,6 +1,6 @@
{
"name": "admin",
"version": "0.22.0",
"version": "0.23.1",
"private": true,
"scripts": {
"dev": "turbo run develop",
@@ -22,7 +22,6 @@
"@types/lodash": "^4.17.0",
"autoprefixer": "10.4.14",
"axios": "^1.7.4",
"js-cookie": "^3.0.5",
"lodash": "^4.17.21",
"lucide-react": "^0.356.0",
"mobx": "^6.12.0",
@@ -41,9 +40,8 @@
"devDependencies": {
"@plane/eslint-config": "*",
"@plane/typescript-config": "*",
"@types/js-cookie": "^3.0.6",
"@types/node": "18.16.1",
"@types/react": "^18.2.48",
"@types/react": "^18.3.11",
"@types/react-dom": "^18.2.18",
"@types/uuid": "^9.0.8",
"@types/zxcvbn": "^4.4.4",

View File

@@ -57,5 +57,6 @@ ADMIN_BASE_URL=
SPACE_BASE_URL=
APP_BASE_URL=
# Hard delete files after days
HARD_DELETE_AFTER_DAYS=60
HARD_DELETE_AFTER_DAYS=60

View File

@@ -4,6 +4,7 @@ FROM python:3.12.5-alpine AS backend
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
ENV PIP_DISABLE_PIP_VERSION_CHECK=1
ENV INSTANCE_CHANGELOG_URL https://api.plane.so/api/public/anchor/8e1c2e4c7bc5493eb7731be3862f6960/pages/
WORKDIR /code

View File

@@ -4,6 +4,7 @@ FROM python:3.12.5-alpine AS backend
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
ENV PIP_DISABLE_PIP_VERSION_CHECK=1
ENV INSTANCE_CHANGELOG_URL https://api.plane.so/api/public/anchor/8e1c2e4c7bc5493eb7731be3862f6960/pages/
RUN apk --no-cache add \
"bash~=5.2" \

View File

@@ -1,4 +1,4 @@
{
"name": "plane-api",
"version": "0.22.0"
"version": "0.23.1"
}

View File

@@ -5,7 +5,6 @@ from .issue import (
IssueSerializer,
LabelSerializer,
IssueLinkSerializer,
IssueAttachmentSerializer,
IssueCommentSerializer,
IssueAttachmentSerializer,
IssueActivitySerializer,
@@ -19,4 +18,4 @@ from .module import (
ModuleIssueSerializer,
ModuleLiteSerializer,
)
from .inbox import InboxIssueSerializer
from .intake import IntakeIssueSerializer

View File

@@ -1,15 +1,17 @@
# Module imports
from .base import BaseSerializer
from .issue import IssueExpandSerializer
from plane.db.models import InboxIssue
from plane.db.models import IntakeIssue
from rest_framework import serializers
class InboxIssueSerializer(BaseSerializer):
class IntakeIssueSerializer(BaseSerializer):
issue_detail = IssueExpandSerializer(read_only=True, source="issue")
inbox = serializers.UUIDField(source="intake.id", read_only=True)
class Meta:
model = InboxIssue
model = IntakeIssue
fields = "__all__"
read_only_fields = [
"id",

View File

@@ -11,7 +11,7 @@ from plane.db.models import (
IssueType,
IssueActivity,
IssueAssignee,
IssueAttachment,
FileAsset,
IssueComment,
IssueLabel,
IssueLink,
@@ -31,6 +31,7 @@ from .user import UserLiteSerializer
from django.core.exceptions import ValidationError
from django.core.validators import URLValidator
class IssueSerializer(BaseSerializer):
assignees = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(
@@ -315,7 +316,7 @@ class IssueLinkSerializer(BaseSerializer):
"created_at",
"updated_at",
]
def validate_url(self, value):
# Check URL format
validate_url = URLValidator()
@@ -359,7 +360,7 @@ class IssueLinkSerializer(BaseSerializer):
class IssueAttachmentSerializer(BaseSerializer):
class Meta:
model = IssueAttachment
model = FileAsset
fields = "__all__"
read_only_fields = [
"id",

View File

@@ -19,6 +19,8 @@ class ProjectSerializer(BaseSerializer):
sort_order = serializers.FloatField(read_only=True)
member_role = serializers.IntegerField(read_only=True)
is_deployed = serializers.BooleanField(read_only=True)
cover_image_url = serializers.CharField(read_only=True)
inbox_view = serializers.BooleanField(read_only=True, source="intake_view")
class Meta:
model = Project
@@ -32,6 +34,7 @@ class ProjectSerializer(BaseSerializer):
"created_by",
"updated_by",
"deleted_at",
"cover_image_url",
]
def validate(self, data):
@@ -87,6 +90,8 @@ class ProjectSerializer(BaseSerializer):
class ProjectLiteSerializer(BaseSerializer):
cover_image_url = serializers.CharField(read_only=True)
class Meta:
model = Project
fields = [
@@ -97,5 +102,6 @@ class ProjectLiteSerializer(BaseSerializer):
"icon_prop",
"emoji",
"description",
"cover_image_url",
]
read_only_fields = fields

View File

@@ -13,6 +13,7 @@ class UserLiteSerializer(BaseSerializer):
"last_name",
"email",
"avatar",
"avatar_url",
"display_name",
"email",
]

View File

@@ -3,7 +3,7 @@ from .state import urlpatterns as state_patterns
from .issue import urlpatterns as issue_patterns
from .cycle import urlpatterns as cycle_patterns
from .module import urlpatterns as module_patterns
from .inbox import urlpatterns as inbox_patterns
from .intake import urlpatterns as intake_patterns
from .member import urlpatterns as member_patterns
urlpatterns = [
@@ -12,6 +12,6 @@ urlpatterns = [
*issue_patterns,
*cycle_patterns,
*module_patterns,
*inbox_patterns,
*intake_patterns,
*member_patterns,
]

View File

@@ -1,17 +0,0 @@
from django.urls import path
from plane.api.views import InboxIssueAPIEndpoint
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inbox-issues/",
InboxIssueAPIEndpoint.as_view(),
name="inbox-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inbox-issues/<uuid:issue_id>/",
InboxIssueAPIEndpoint.as_view(),
name="inbox-issue",
),
]

View File

@@ -0,0 +1,27 @@
from django.urls import path
from plane.api.views import IntakeIssueAPIEndpoint
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inbox-issues/",
IntakeIssueAPIEndpoint.as_view(),
name="inbox-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inbox-issues/<uuid:issue_id>/",
IntakeIssueAPIEndpoint.as_view(),
name="inbox-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/intake-issues/",
IntakeIssueAPIEndpoint.as_view(),
name="intake-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/intake-issues/<uuid:issue_id>/",
IntakeIssueAPIEndpoint.as_view(),
name="intake-issue",
),
]
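The urlpatterns above keep the legacy `inbox-issues` routes alongside the new `intake-issues` routes, both resolving to `IntakeIssueAPIEndpoint`, so existing API clients keep working after the Inbox-to-Intake rename. A minimal plain-Python sketch of that aliasing (not Django; names are illustrative):

```python
def make_router():
    """Map both the legacy and the new URL prefix to the same handler."""
    def intake_issue_endpoint(slug, project_id, issue_id=None):
        # Stand-in for IntakeIssueAPIEndpoint.as_view()
        return ("IntakeIssueAPIEndpoint", slug, project_id, issue_id)

    routes = {}
    for prefix in ("inbox-issues", "intake-issues"):  # legacy alias + new name
        routes[prefix] = intake_issue_endpoint
    return routes

router = make_router()
# Both prefixes dispatch to the same endpoint with identical results.
print(router["inbox-issues"]("acme", "p1") == router["intake-issues"]("acme", "p1"))
# → True
```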

View File

@@ -27,5 +27,4 @@ from .module import (
from .member import ProjectMemberAPIEndpoint
from .inbox import InboxIssueAPIEndpoint
from .intake import IntakeIssueAPIEndpoint

View File

@@ -13,8 +13,12 @@ from django.db.models import (
Q,
Sum,
FloatField,
Case,
When,
Value,
)
from django.db.models.functions import Cast
from django.db.models.functions import Cast, Concat
from django.db import models
# Third party imports
from rest_framework import status
@@ -32,7 +36,7 @@ from plane.db.models import (
CycleIssue,
Issue,
Project,
IssueAttachment,
FileAsset,
IssueLink,
ProjectMember,
UserFavorite,
@@ -74,6 +78,7 @@ class CycleAPIEndpoint(BaseAPIView):
filter=Q(
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -84,6 +89,7 @@ class CycleAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="completed",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -94,6 +100,7 @@ class CycleAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="cancelled",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -104,6 +111,7 @@ class CycleAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="started",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -114,6 +122,7 @@ class CycleAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="unstarted",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -124,6 +133,7 @@ class CycleAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="backlog",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -207,8 +217,7 @@ class CycleAPIEndpoint(BaseAPIView):
# Incomplete Cycles
if cycle_view == "incomplete":
queryset = queryset.filter(
Q(end_date__gte=timezone.now().date())
| Q(end_date__isnull=True),
Q(end_date__gte=timezone.now()) | Q(end_date__isnull=True),
)
return self.paginate(
request=request,
@@ -309,10 +318,7 @@ class CycleAPIEndpoint(BaseAPIView):
request_data = request.data
if (
cycle.end_date is not None
and cycle.end_date < timezone.now().date()
):
if cycle.end_date is not None and cycle.end_date < timezone.now():
if "sort_order" in request_data:
# Can only change sort order
request_data = {
@@ -405,10 +411,6 @@ class CycleAPIEndpoint(BaseAPIView):
)
# Delete the cycle
cycle.delete()
# Delete the cycle issues
CycleIssue.objects.filter(
cycle_id=self.kwargs.get("pk"),
).delete()
# Delete the user favorite cycle
UserFavorite.objects.filter(
entity_type="cycle",
@@ -441,6 +443,7 @@ class CycleArchiveUnarchiveAPIEndpoint(BaseAPIView):
filter=Q(
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -451,6 +454,7 @@ class CycleArchiveUnarchiveAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="completed",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -461,6 +465,7 @@ class CycleArchiveUnarchiveAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="cancelled",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -471,6 +476,7 @@ class CycleArchiveUnarchiveAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="started",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -481,6 +487,7 @@ class CycleArchiveUnarchiveAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="unstarted",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -491,6 +498,7 @@ class CycleArchiveUnarchiveAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="backlog",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -504,6 +512,7 @@ class CycleArchiveUnarchiveAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="completed",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -514,6 +523,7 @@ class CycleArchiveUnarchiveAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="started",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -537,7 +547,7 @@ class CycleArchiveUnarchiveAPIEndpoint(BaseAPIView):
cycle = Cycle.objects.get(
pk=cycle_id, project_id=project_id, workspace__slug=slug
)
if cycle.end_date >= timezone.now().date():
if cycle.end_date >= timezone.now():
return Response(
{"error": "Only completed cycles can be archived"},
status=status.HTTP_400_BAD_REQUEST,
@@ -619,7 +629,10 @@ class CycleIssueAPIEndpoint(BaseAPIView):
# List
order_by = request.GET.get("order_by", "created_at")
issues = (
Issue.issue_objects.filter(issue_cycle__cycle_id=cycle_id)
Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
)
.annotate(
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("id")
@@ -645,8 +658,9 @@ class CycleIssueAPIEndpoint(BaseAPIView):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -815,6 +829,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
filter=Q(
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -825,6 +840,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="completed",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -835,6 +851,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="cancelled",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -845,6 +862,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="started",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -855,6 +873,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="unstarted",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -865,6 +884,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="backlog",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -881,13 +901,34 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
assignee_estimate_data = (
Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
)
.annotate(display_name=F("assignees__display_name"))
.annotate(assignee_id=F("assignees__id"))
.annotate(avatar=F("assignees__avatar"))
.values("display_name", "assignee_id", "avatar")
.annotate(
avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
assignees__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"assignees__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
assignees__avatar_asset__isnull=True,
then="assignees__avatar",
),
default=Value(None),
output_field=models.CharField(),
)
)
.values("display_name", "assignee_id", "avatar", "avatar_url")
.annotate(
total_estimates=Sum(
Cast("estimate_point__value", FloatField())
@@ -924,7 +965,8 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
if item["assignee_id"]
else None
),
"avatar": item["avatar"],
"avatar": item.get("avatar", None),
"avatar_url": item.get("avatar_url", None),
"total_estimates": item["total_estimates"],
"completed_estimates": item["completed_estimates"],
"pending_estimates": item["pending_estimates"],
@@ -935,6 +977,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
label_distribution_data = (
Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
)
@@ -996,13 +1039,34 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
assignee_distribution = (
Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
)
.annotate(display_name=F("assignees__display_name"))
.annotate(assignee_id=F("assignees__id"))
.annotate(avatar=F("assignees__avatar"))
.values("display_name", "assignee_id", "avatar")
.annotate(
avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
assignees__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"assignees__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
assignees__avatar_asset__isnull=True,
then="assignees__avatar",
),
default=Value(None),
output_field=models.CharField(),
)
)
.values("display_name", "assignee_id", "avatar_url")
.annotate(
total_issues=Count(
"id",
@@ -1041,7 +1105,8 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
"assignee_id": (
str(item["assignee_id"]) if item["assignee_id"] else None
),
"avatar": item["avatar"],
"avatar": item.get("avatar", None),
"avatar_url": item.get("avatar_url", None),
"total_issues": item["total_issues"],
"completed_issues": item["completed_issues"],
"pending_issues": item["pending_issues"],
@@ -1053,6 +1118,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
label_distribution = (
Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
)
@@ -1146,7 +1212,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
if (
new_cycle.end_date is not None
and new_cycle.end_date < timezone.now().date()
and new_cycle.end_date < timezone.now()
):
return Response(
{

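The `avatar_url` annotation added above prefers the uploaded `avatar_asset` (building a static-asset path) and falls back to the legacy `avatar` field. A minimal pure-Python sketch of that fallback logic, assuming the `/api/assets/v2/static/` prefix from the diff (the function name is hypothetical):

```python
def resolve_avatar_url(avatar_asset_id, avatar):
    """Mirror the Case/When annotation: prefer the uploaded asset,
    fall back to the legacy avatar string, else None."""
    if avatar_asset_id is not None:
        # Matches Concat(Value("/api/assets/v2/static/"), asset id, Value("/"))
        return f"/api/assets/v2/static/{avatar_asset_id}/"
    # When avatar_asset is null, fall back to the plain avatar field
    return avatar or None
```

This is a sketch of the selection rule only; the real annotation runs inside the database query.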
View File

@@ -14,12 +14,12 @@ from rest_framework import status
from rest_framework.response import Response
# Module imports
from plane.api.serializers import InboxIssueSerializer, IssueSerializer
from plane.api.serializers import IntakeIssueSerializer, IssueSerializer
from plane.app.permissions import ProjectLitePermission
from plane.bgtasks.issue_activities_task import issue_activity
from plane.db.models import (
Inbox,
InboxIssue,
Intake,
IntakeIssue,
Issue,
Project,
ProjectMember,
@@ -29,10 +29,10 @@ from plane.db.models import (
from .base import BaseAPIView
class InboxIssueAPIEndpoint(BaseAPIView):
class IntakeIssueAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to inbox issues.
`update` and `destroy` actions related to intake issues.
"""
@@ -40,15 +40,15 @@ class InboxIssueAPIEndpoint(BaseAPIView):
ProjectLitePermission,
]
serializer_class = InboxIssueSerializer
model = InboxIssue
serializer_class = IntakeIssueSerializer
model = IntakeIssue
filterset_fields = [
"status",
]
def get_queryset(self):
inbox = Inbox.objects.filter(
intake = Intake.objects.filter(
workspace__slug=self.kwargs.get("slug"),
project_id=self.kwargs.get("project_id"),
).first()
@@ -58,16 +58,16 @@ class InboxIssueAPIEndpoint(BaseAPIView):
pk=self.kwargs.get("project_id"),
)
if inbox is None and not project.inbox_view:
return InboxIssue.objects.none()
if intake is None and not project.intake_view:
return IntakeIssue.objects.none()
return (
InboxIssue.objects.filter(
IntakeIssue.objects.filter(
Q(snoozed_till__gte=timezone.now())
| Q(snoozed_till__isnull=True),
workspace__slug=self.kwargs.get("slug"),
project_id=self.kwargs.get("project_id"),
inbox_id=inbox.id,
intake_id=intake.id,
)
.select_related("issue", "workspace", "project")
.order_by(self.kwargs.get("order_by", "-created_at"))
@@ -75,22 +75,22 @@ class InboxIssueAPIEndpoint(BaseAPIView):
def get(self, request, slug, project_id, issue_id=None):
if issue_id:
inbox_issue_queryset = self.get_queryset().get(issue_id=issue_id)
inbox_issue_data = InboxIssueSerializer(
inbox_issue_queryset,
intake_issue_queryset = self.get_queryset().get(issue_id=issue_id)
intake_issue_data = IntakeIssueSerializer(
intake_issue_queryset,
fields=self.fields,
expand=self.expand,
).data
return Response(
inbox_issue_data,
intake_issue_data,
status=status.HTTP_200_OK,
)
issue_queryset = self.get_queryset()
return self.paginate(
request=request,
queryset=(issue_queryset),
on_results=lambda inbox_issues: InboxIssueSerializer(
inbox_issues,
on_results=lambda intake_issues: IntakeIssueSerializer(
intake_issues,
many=True,
fields=self.fields,
expand=self.expand,
@@ -104,7 +104,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST,
)
inbox = Inbox.objects.filter(
intake = Intake.objects.filter(
workspace__slug=slug, project_id=project_id
).first()
@@ -113,11 +113,11 @@ class InboxIssueAPIEndpoint(BaseAPIView):
pk=project_id,
)
# Inbox view
if inbox is None and not project.inbox_view:
# Intake view
if intake is None and not project.intake_view:
return Response(
{
"error": "Inbox is not enabled for this project enable it through the project's api"
"error": "Intake is not enabled for this project. Enable it through the project's API."
},
status=status.HTTP_400_BAD_REQUEST,
)
@@ -139,7 +139,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
state, _ = State.objects.get_or_create(
name="Triage",
group="triage",
description="Default state for managing all Inbox Issues",
description="Default state for managing all Intake Issues",
project_id=project_id,
color="#ff7700",
is_triage=True,
@@ -157,12 +157,12 @@ class InboxIssueAPIEndpoint(BaseAPIView):
state=state,
)
# create an inbox issue
inbox_issue = InboxIssue.objects.create(
inbox_id=inbox.id,
# create an intake issue
intake_issue = IntakeIssue.objects.create(
intake_id=intake.id,
project_id=project_id,
issue=issue,
source=request.data.get("source", "in-app"),
source=request.data.get("source", "IN-APP"),
)
# Create an Issue Activity
issue_activity.delay(
@@ -173,32 +173,37 @@ class InboxIssueAPIEndpoint(BaseAPIView):
project_id=str(project_id),
current_instance=None,
epoch=int(timezone.now().timestamp()),
inbox=str(inbox_issue.id),
intake=str(intake_issue.id),
)
serializer = InboxIssueSerializer(inbox_issue)
serializer = IntakeIssueSerializer(intake_issue)
return Response(serializer.data, status=status.HTTP_200_OK)
def patch(self, request, slug, project_id, issue_id):
inbox = Inbox.objects.filter(
intake = Intake.objects.filter(
workspace__slug=slug, project_id=project_id
).first()
# Inbox view
if inbox is None:
project = Project.objects.get(
workspace__slug=slug,
pk=project_id,
)
# Intake view
if intake is None and not project.intake_view:
return Response(
{
"error": "Inbox is not enabled for this project enable it through the project's api"
"error": "Intake is not enabled for this project. Enable it through the project's API."
},
status=status.HTTP_400_BAD_REQUEST,
)
# Get the inbox issue
inbox_issue = InboxIssue.objects.get(
# Get the intake issue
intake_issue = IntakeIssue.objects.get(
issue_id=issue_id,
workspace__slug=slug,
project_id=project_id,
inbox_id=inbox.id,
intake_id=intake.id,
)
# Get the project member
@@ -210,11 +215,11 @@ class InboxIssueAPIEndpoint(BaseAPIView):
)
# Only project members, admins, and created_by users can access this endpoint
if project_member.role <= 5 and str(inbox_issue.created_by_id) != str(
if project_member.role <= 5 and str(intake_issue.created_by_id) != str(
request.user.id
):
return Response(
{"error": "You cannot edit inbox issues"},
{"error": "You cannot edit intake issues"},
status=status.HTTP_400_BAD_REQUEST,
)
@@ -227,7 +232,10 @@ class InboxIssueAPIEndpoint(BaseAPIView):
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
filter=Q(
~Q(labels__id__isnull=True)
& Q(label_issue__deleted_at__isnull=True),
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -235,7 +243,11 @@ class InboxIssueAPIEndpoint(BaseAPIView):
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True),
filter=Q(
~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True)
& Q(issue_assignee__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -276,7 +288,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
cls=DjangoJSONEncoder,
),
epoch=int(timezone.now().timestamp()),
inbox=(inbox_issue.id),
intake=(intake_issue.id),
)
issue_serializer.save()
else:
@@ -284,13 +296,13 @@ class InboxIssueAPIEndpoint(BaseAPIView):
issue_serializer.errors, status=status.HTTP_400_BAD_REQUEST
)
# Only project admins and members can edit inbox issue attributes
if project_member.role > 5:
serializer = InboxIssueSerializer(
inbox_issue, data=request.data, partial=True
# Only project admins and members can edit intake issue attributes
if project_member.role > 15:
serializer = IntakeIssueSerializer(
intake_issue, data=request.data, partial=True
)
current_instance = json.dumps(
InboxIssueSerializer(inbox_issue).data, cls=DjangoJSONEncoder
IntakeIssueSerializer(intake_issue).data, cls=DjangoJSONEncoder
)
if serializer.is_valid():
@@ -333,7 +345,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
# create an activity for status change
issue_activity.delay(
type="inbox.activity.created",
type="intake.activity.created",
requested_data=json.dumps(
request.data, cls=DjangoJSONEncoder
),
@@ -344,7 +356,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
epoch=int(timezone.now().timestamp()),
notification=False,
origin=request.META.get("HTTP_ORIGIN"),
inbox=str(inbox_issue.id),
intake=str(intake_issue.id),
)
return Response(serializer.data, status=status.HTTP_200_OK)
@@ -353,12 +365,12 @@ class InboxIssueAPIEndpoint(BaseAPIView):
)
else:
return Response(
InboxIssueSerializer(inbox_issue).data,
IntakeIssueSerializer(intake_issue).data,
status=status.HTTP_200_OK,
)
def delete(self, request, slug, project_id, issue_id):
inbox = Inbox.objects.filter(
intake = Intake.objects.filter(
workspace__slug=slug, project_id=project_id
).first()
@@ -367,25 +379,25 @@ class InboxIssueAPIEndpoint(BaseAPIView):
pk=project_id,
)
# Inbox view
if inbox is None and not project.inbox_view:
# Intake view
if intake is None and not project.intake_view:
return Response(
{
"error": "Inbox is not enabled for this project enable it through the project's api"
"error": "Intake is not enabled for this project. Enable it through the project's API."
},
status=status.HTTP_400_BAD_REQUEST,
)
# Get the inbox issue
inbox_issue = InboxIssue.objects.get(
# Get the intake issue
intake_issue = IntakeIssue.objects.get(
issue_id=issue_id,
workspace__slug=slug,
project_id=project_id,
inbox_id=inbox.id,
intake_id=intake.id,
)
# Check the issue status
if inbox_issue.status in [-2, -1, 0, 2]:
if intake_issue.status in [-2, -1, 0, 2]:
# Delete the issue also
issue = Issue.objects.filter(
workspace__slug=slug, project_id=project_id, pk=issue_id
@@ -405,5 +417,5 @@ class InboxIssueAPIEndpoint(BaseAPIView):
)
issue.delete()
inbox_issue.delete()
intake_issue.delete()
return Response(status=status.HTTP_204_NO_CONTENT)

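The intake queryset above keeps an issue when it was never snoozed, or when its snooze deadline has not yet passed, via `Q(snoozed_till__gte=timezone.now()) | Q(snoozed_till__isnull=True)`. That predicate can be sketched standalone (the helper name is illustrative):

```python
from datetime import datetime


def is_intake_issue_visible(snoozed_till, now=None):
    """True when snoozed_till is unset or still in the future --
    mirrors Q(snoozed_till__gte=now) | Q(snoozed_till__isnull=True)."""
    now = now or datetime.now()
    return snoozed_till is None or snoozed_till >= now
```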
View File

@@ -16,6 +16,7 @@ from django.db.models import (
Q,
Value,
When,
Subquery,
)
from django.utils import timezone
@@ -42,12 +43,13 @@ from plane.bgtasks.issue_activities_task import issue_activity
from plane.db.models import (
Issue,
IssueActivity,
IssueAttachment,
FileAsset,
IssueComment,
IssueLink,
Label,
Project,
ProjectMember,
CycleIssue,
)
from .base import BaseAPIView
@@ -202,7 +204,13 @@ class IssueAPIEndpoint(BaseAPIView):
issue_queryset = (
self.get_queryset()
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(
cycle_id=Subquery(
CycleIssue.objects.filter(
issue=OuterRef("id"), deleted_at__isnull=True
).values("cycle_id")[:1]
)
)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@@ -210,8 +218,9 @@ class IssueAPIEndpoint(BaseAPIView):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -1062,7 +1071,7 @@ class IssueAttachmentEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
model = IssueAttachment
model = FileAsset
parser_classes = (MultiPartParser, FormParser)
def post(self, request, slug, project_id, issue_id):
@@ -1070,7 +1079,7 @@ class IssueAttachmentEndpoint(BaseAPIView):
if (
request.data.get("external_id")
and request.data.get("external_source")
and IssueAttachment.objects.filter(
and FileAsset.objects.filter(
project_id=project_id,
workspace__slug=slug,
issue_id=issue_id,
@@ -1078,7 +1087,7 @@ class IssueAttachmentEndpoint(BaseAPIView):
external_id=request.data.get("external_id"),
).exists()
):
issue_attachment = IssueAttachment.objects.filter(
issue_attachment = FileAsset.objects.filter(
workspace__slug=slug,
project_id=project_id,
external_id=request.data.get("external_id"),
@@ -1112,7 +1121,7 @@ class IssueAttachmentEndpoint(BaseAPIView):
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def delete(self, request, slug, project_id, issue_id, pk):
issue_attachment = IssueAttachment.objects.get(pk=pk)
issue_attachment = FileAsset.objects.get(pk=pk)
issue_attachment.asset.delete(save=False)
issue_attachment.delete()
issue_activity.delay(
@@ -1130,7 +1139,7 @@ class IssueAttachmentEndpoint(BaseAPIView):
return Response(status=status.HTTP_204_NO_CONTENT)
def get(self, request, slug, project_id, issue_id):
issue_attachments = IssueAttachment.objects.filter(
issue_attachments = FileAsset.objects.filter(
issue_id=issue_id, workspace__slug=slug, project_id=project_id
)
serializer = IssueAttachmentSerializer(issue_attachments, many=True)

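The `Subquery(...).values("cycle_id")[:1]` annotation above replaces the plain join so a soft-deleted `CycleIssue` row no longer surfaces a stale `cycle_id`. The same selection rule, sketched over plain dicts (the helper name is hypothetical):

```python
def active_cycle_id(cycle_links):
    """Return the cycle_id of the first cycle link that has not been
    soft-deleted (deleted_at is None), mirroring the [:1] subquery."""
    for link in cycle_links:
        if link.get("deleted_at") is None:
            return link["cycle_id"]
    return None
```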
View File

@@ -21,7 +21,7 @@ from plane.app.permissions import ProjectEntityPermission
from plane.bgtasks.issue_activities_task import issue_activity
from plane.db.models import (
Issue,
IssueAttachment,
FileAsset,
IssueLink,
Module,
ModuleIssue,
@@ -71,6 +71,7 @@ class ModuleAPIEndpoint(BaseAPIView):
filter=Q(
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
),
@@ -82,6 +83,7 @@ class ModuleAPIEndpoint(BaseAPIView):
issue_module__issue__state__group="completed",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
@@ -93,6 +95,7 @@ class ModuleAPIEndpoint(BaseAPIView):
issue_module__issue__state__group="cancelled",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
@@ -104,6 +107,7 @@ class ModuleAPIEndpoint(BaseAPIView):
issue_module__issue__state__group="started",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
@@ -115,6 +119,7 @@ class ModuleAPIEndpoint(BaseAPIView):
issue_module__issue__state__group="unstarted",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
@@ -126,6 +131,7 @@ class ModuleAPIEndpoint(BaseAPIView):
issue_module__issue__state__group="backlog",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
@@ -367,7 +373,10 @@ class ModuleIssueAPIEndpoint(BaseAPIView):
def get(self, request, slug, project_id, module_id):
order_by = request.GET.get("order_by", "created_at")
issues = (
Issue.issue_objects.filter(issue_module__module_id=module_id)
Issue.issue_objects.filter(
issue_module__module_id=module_id,
issue_module__deleted_at__isnull=True,
)
.annotate(
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("id")
@@ -393,8 +402,9 @@ class ModuleIssueAPIEndpoint(BaseAPIView):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -551,6 +561,7 @@ class ModuleArchiveUnarchiveAPIEndpoint(BaseAPIView):
filter=Q(
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
),
@@ -562,6 +573,7 @@ class ModuleArchiveUnarchiveAPIEndpoint(BaseAPIView):
issue_module__issue__state__group="completed",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
@@ -573,6 +585,7 @@ class ModuleArchiveUnarchiveAPIEndpoint(BaseAPIView):
issue_module__issue__state__group="cancelled",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
@@ -584,6 +597,7 @@ class ModuleArchiveUnarchiveAPIEndpoint(BaseAPIView):
issue_module__issue__state__group="started",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
@@ -595,6 +609,7 @@ class ModuleArchiveUnarchiveAPIEndpoint(BaseAPIView):
issue_module__issue__state__group="unstarted",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
@@ -606,6 +621,7 @@ class ModuleArchiveUnarchiveAPIEndpoint(BaseAPIView):
issue_module__issue__state__group="backlog",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)

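Each aggregate in the module endpoints above now carries `issue_module__deleted_at__isnull=True`, so soft-deleted module links stop inflating the per-state counts. A sketch of the same counting rule over in-memory rows (field names follow the diff; the function itself is illustrative):

```python
def count_issues_by_group(links, group):
    """Count module-issue links in a state group, skipping soft-deleted
    links, archived issues, and drafts -- as the annotate() filters do."""
    return sum(
        1
        for link in links
        if link["deleted_at"] is None
        and link["archived_at"] is None
        and not link["is_draft"]
        and link["state_group"] == group
    )
```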
View File

@@ -18,7 +18,7 @@ from plane.app.permissions import ProjectBasePermission
# Module imports
from plane.db.models import (
Cycle,
Inbox,
Intake,
IssueUserProperty,
Module,
Project,
@@ -285,6 +285,11 @@ class ProjectAPIEndpoint(BaseAPIView):
current_instance = json.dumps(
ProjectSerializer(project).data, cls=DjangoJSONEncoder
)
intake_view = request.data.get(
"inbox_view", request.data.get("intake_view", False)
)
if project.archived_at:
return Response(
{"error": "Archived project cannot be updated"},
@@ -293,21 +298,24 @@ class ProjectAPIEndpoint(BaseAPIView):
serializer = ProjectSerializer(
project,
data={**request.data},
data={
**request.data,
"intake_view": intake_view,
},
context={"workspace_id": workspace.id},
partial=True,
)
if serializer.is_valid():
serializer.save()
if serializer.data["inbox_view"]:
inbox = Inbox.objects.filter(
if serializer.data["intake_view"]:
intake = Intake.objects.filter(
project=project,
is_default=True,
).first()
if not inbox:
Inbox.objects.create(
name=f"{project.name} Inbox",
if not intake:
Intake.objects.create(
name=f"{project.name} Intake",
project=project,
is_default=True,
)
@@ -316,7 +324,7 @@ class ProjectAPIEndpoint(BaseAPIView):
State.objects.get_or_create(
name="Triage",
group="triage",
description="Default state for managing all Inbox Issues",
description="Default state for managing all Intake Issues",
project_id=pk,
color="#ff7700",
is_triage=True,

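The project PATCH handler above accepts either the legacy `inbox_view` key or the new `intake_view` key, defaulting to `False`. That lookup order can be sketched with a plain dict standing in for `request.data` (the helper name is hypothetical):

```python
def read_intake_view(data):
    """Prefer the legacy "inbox_view" key for backward compatibility,
    then "intake_view", then default to False."""
    return data.get("inbox_view", data.get("intake_view", False))
```

Note that a present-but-falsy `inbox_view` wins over `intake_view`, exactly as the nested `get` calls in the diff behave.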
View File

@@ -57,7 +57,7 @@ from .issue import (
IssueFlatSerializer,
IssueStateSerializer,
IssueLinkSerializer,
IssueInboxSerializer,
IssueIntakeSerializer,
IssueLiteSerializer,
IssueAttachmentSerializer,
IssueSubscriberSerializer,
@@ -102,12 +102,12 @@ from .estimate import (
WorkspaceEstimateSerializer,
)
from .inbox import (
InboxSerializer,
InboxIssueSerializer,
IssueStateInboxSerializer,
InboxIssueLiteSerializer,
InboxIssueDetailSerializer,
from .intake import (
IntakeSerializer,
IntakeIssueSerializer,
IssueStateIntakeSerializer,
IntakeIssueLiteSerializer,
IntakeIssueDetailSerializer,
)
from .analytic import AnalyticViewSerializer
@@ -124,3 +124,9 @@ from .webhook import WebhookSerializer, WebhookLogSerializer
from .dashboard import DashboardSerializer, WidgetSerializer
from .favorite import UserFavoriteSerializer
from .draft import (
DraftIssueCreateSerializer,
DraftIssueSerializer,
DraftIssueDetailSerializer,
)

View File

@@ -60,10 +60,10 @@ class DynamicBaseSerializer(BaseSerializer):
CycleIssueSerializer,
IssueLiteSerializer,
IssueRelationSerializer,
InboxIssueLiteSerializer,
IntakeIssueLiteSerializer,
IssueReactionLiteSerializer,
IssueAttachmentLiteSerializer,
IssueLinkLiteSerializer,
RelatedIssueSerializer,
)
# Expansion mapper
@@ -84,13 +84,14 @@ class DynamicBaseSerializer(BaseSerializer):
"issue_cycle": CycleIssueSerializer,
"parent": IssueLiteSerializer,
"issue_relation": IssueRelationSerializer,
"issue_inbox": InboxIssueLiteSerializer,
"issue_intake": IntakeIssueLiteSerializer,
"issue_related": RelatedIssueSerializer,
"issue_reactions": IssueReactionLiteSerializer,
"issue_attachment": IssueAttachmentLiteSerializer,
"issue_link": IssueLinkLiteSerializer,
"sub_issues": IssueLiteSerializer,
}
if field not in self.fields and field in expansion:
self.fields[field] = expansion[field](
many=(
True
@@ -101,11 +102,12 @@ class DynamicBaseSerializer(BaseSerializer):
"labels",
"issue_cycle",
"issue_relation",
"issue_inbox",
"issue_intake",
"issue_reactions",
"issue_attachment",
"issue_link",
"sub_issues",
"issue_related",
]
else False
)
@@ -130,11 +132,12 @@ class DynamicBaseSerializer(BaseSerializer):
LabelSerializer,
CycleIssueSerializer,
IssueRelationSerializer,
InboxIssueLiteSerializer,
IntakeIssueLiteSerializer,
IssueLiteSerializer,
IssueReactionLiteSerializer,
IssueAttachmentLiteSerializer,
IssueLinkLiteSerializer,
RelatedIssueSerializer,
)
# Expansion mapper
@@ -155,7 +158,8 @@ class DynamicBaseSerializer(BaseSerializer):
"issue_cycle": CycleIssueSerializer,
"parent": IssueLiteSerializer,
"issue_relation": IssueRelationSerializer,
"issue_inbox": InboxIssueLiteSerializer,
"issue_intake": IntakeIssueLiteSerializer,
"issue_related": RelatedIssueSerializer,
"issue_reactions": IssueReactionLiteSerializer,
"issue_attachment": IssueAttachmentLiteSerializer,
"issue_link": IssueLinkLiteSerializer,
@@ -178,4 +182,29 @@ class DynamicBaseSerializer(BaseSerializer):
instance, f"{expand}_id", None
)
# Check if issue_attachments is in fields or expand
if (
"issue_attachments" in self.fields
or "issue_attachments" in self.expand
):
# Import the model here to avoid circular imports
from plane.db.models import FileAsset
issue_id = getattr(instance, "id", None)
if issue_id:
# Fetch related issue_attachments
issue_attachments = FileAsset.objects.filter(
issue_id=issue_id,
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
# Serialize issue_attachments and add them to the response
response["issue_attachments"] = (
IssueAttachmentLiteSerializer(
issue_attachments, many=True
).data
)
else:
response["issue_attachments"] = []
return response

View File

@@ -0,0 +1,292 @@
# Django imports
from django.utils import timezone
# Third Party imports
from rest_framework import serializers
# Module imports
from .base import BaseSerializer
from plane.db.models import (
User,
Issue,
Label,
State,
DraftIssue,
DraftIssueAssignee,
DraftIssueLabel,
DraftIssueCycle,
DraftIssueModule,
)
class DraftIssueCreateSerializer(BaseSerializer):
# ids
state_id = serializers.PrimaryKeyRelatedField(
source="state",
queryset=State.objects.all(),
required=False,
allow_null=True,
)
parent_id = serializers.PrimaryKeyRelatedField(
source="parent",
queryset=Issue.objects.all(),
required=False,
allow_null=True,
)
label_ids = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(queryset=Label.objects.all()),
write_only=True,
required=False,
)
assignee_ids = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(queryset=User.objects.all()),
write_only=True,
required=False,
)
class Meta:
model = DraftIssue
fields = "__all__"
read_only_fields = [
"workspace",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
def to_representation(self, instance):
data = super().to_representation(instance)
assignee_ids = self.initial_data.get("assignee_ids")
data["assignee_ids"] = assignee_ids if assignee_ids else []
label_ids = self.initial_data.get("label_ids")
data["label_ids"] = label_ids if label_ids else []
return data
def validate(self, data):
if (
data.get("start_date", None) is not None
and data.get("target_date", None) is not None
and data.get("start_date", None) > data.get("target_date", None)
):
raise serializers.ValidationError(
"Start date cannot exceed target date"
)
return data
def create(self, validated_data):
assignees = validated_data.pop("assignee_ids", None)
labels = validated_data.pop("label_ids", None)
modules = validated_data.pop("module_ids", None)
cycle_id = self.initial_data.get("cycle_id", None)
modules = self.initial_data.get("module_ids", None)
workspace_id = self.context["workspace_id"]
project_id = self.context["project_id"]
# Create Issue
issue = DraftIssue.objects.create(
**validated_data,
workspace_id=workspace_id,
project_id=project_id,
)
# Issue Audit Users
created_by_id = issue.created_by_id
updated_by_id = issue.updated_by_id
if assignees is not None and len(assignees):
DraftIssueAssignee.objects.bulk_create(
[
DraftIssueAssignee(
assignee=user,
draft_issue=issue,
workspace_id=workspace_id,
project_id=project_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
for user in assignees
],
batch_size=10,
)
if labels is not None and len(labels):
DraftIssueLabel.objects.bulk_create(
[
DraftIssueLabel(
label=label,
draft_issue=issue,
project_id=project_id,
workspace_id=workspace_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
for label in labels
],
batch_size=10,
)
if cycle_id is not None:
DraftIssueCycle.objects.create(
cycle_id=cycle_id,
draft_issue=issue,
project_id=project_id,
workspace_id=workspace_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
if modules is not None and len(modules):
DraftIssueModule.objects.bulk_create(
[
DraftIssueModule(
module_id=module_id,
draft_issue=issue,
project_id=project_id,
workspace_id=workspace_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
for module_id in modules
],
batch_size=10,
)
return issue
def update(self, instance, validated_data):
assignees = validated_data.pop("assignee_ids", None)
labels = validated_data.pop("label_ids", None)
cycle_id = self.context.get("cycle_id", None)
modules = self.initial_data.get("module_ids", None)
# Related models
workspace_id = instance.workspace_id
project_id = instance.project_id
created_by_id = instance.created_by_id
updated_by_id = instance.updated_by_id
if assignees is not None:
DraftIssueAssignee.objects.filter(draft_issue=instance).delete()
DraftIssueAssignee.objects.bulk_create(
[
DraftIssueAssignee(
assignee=user,
draft_issue=instance,
workspace_id=workspace_id,
project_id=project_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
for user in assignees
],
batch_size=10,
)
if labels is not None:
DraftIssueLabel.objects.filter(draft_issue=instance).delete()
DraftIssueLabel.objects.bulk_create(
[
DraftIssueLabel(
label=label,
draft_issue=instance,
workspace_id=workspace_id,
project_id=project_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
for label in labels
],
batch_size=10,
)
if cycle_id != "not_provided":
DraftIssueCycle.objects.filter(draft_issue=instance).delete()
if cycle_id:
DraftIssueCycle.objects.create(
cycle_id=cycle_id,
draft_issue=instance,
workspace_id=workspace_id,
project_id=project_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
if modules is not None:
DraftIssueModule.objects.filter(draft_issue=instance).delete()
DraftIssueModule.objects.bulk_create(
[
DraftIssueModule(
module_id=module_id,
draft_issue=instance,
workspace_id=workspace_id,
project_id=project_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
for module_id in modules
],
batch_size=10,
)
# Touch updated_at even when only related models are updated
instance.updated_at = timezone.now()
return super().update(instance, validated_data)
class DraftIssueSerializer(BaseSerializer):
# ids
cycle_id = serializers.PrimaryKeyRelatedField(read_only=True)
module_ids = serializers.ListField(
child=serializers.UUIDField(),
required=False,
)
# Many to many
label_ids = serializers.ListField(
child=serializers.UUIDField(),
required=False,
)
assignee_ids = serializers.ListField(
child=serializers.UUIDField(),
required=False,
)
class Meta:
model = DraftIssue
fields = [
"id",
"name",
"state_id",
"sort_order",
"completed_at",
"estimate_point",
"priority",
"start_date",
"target_date",
"project_id",
"parent_id",
"cycle_id",
"module_ids",
"label_ids",
"assignee_ids",
"created_at",
"updated_at",
"created_by",
"updated_by",
"type_id",
"description_html",
]
read_only_fields = fields
class DraftIssueDetailSerializer(DraftIssueSerializer):
description_html = serializers.CharField()
class Meta(DraftIssueSerializer.Meta):
fields = DraftIssueSerializer.Meta.fields + [
"description_html",
]
read_only_fields = fields
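The `update()` method above distinguishes "cycle_id absent from the request" from "cycle_id explicitly null" by defaulting the context lookup to the sentinel string `"not_provided"`. A minimal sketch of that tri-state pattern; the helper name and return values here are illustrative, not Plane's API:

```python
NOT_PROVIDED = "not_provided"

def resolve_cycle_action(context: dict) -> str:
    cycle_id = context.get("cycle_id", NOT_PROVIDED)
    if cycle_id == NOT_PROVIDED:
        return "keep"          # key absent: leave the existing link untouched
    if cycle_id is None:
        return "clear"         # explicit null: delete the DraftIssueCycle row
    return f"link:{cycle_id}"  # value given: replace the relation
```

A plain `context.get("cycle_id")` could not tell the first two cases apart, which is why the serializer compares against the sentinel before deleting or recreating the cycle link.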

View File

@@ -4,22 +4,22 @@ from rest_framework import serializers
# Module imports
from .base import BaseSerializer
from .issue import (
IssueInboxSerializer,
IssueIntakeSerializer,
LabelLiteSerializer,
IssueDetailSerializer,
)
from .project import ProjectLiteSerializer
from .state import StateLiteSerializer
from .user import UserLiteSerializer
from plane.db.models import Inbox, InboxIssue, Issue
from plane.db.models import Intake, IntakeIssue, Issue
class InboxSerializer(BaseSerializer):
class IntakeSerializer(BaseSerializer):
project_detail = ProjectLiteSerializer(source="project", read_only=True)
pending_issue_count = serializers.IntegerField(read_only=True)
class Meta:
model = Inbox
model = Intake
fields = "__all__"
read_only_fields = [
"project",
@@ -27,11 +27,11 @@ class InboxSerializer(BaseSerializer):
]
class InboxIssueSerializer(BaseSerializer):
issue = IssueInboxSerializer(read_only=True)
class IntakeIssueSerializer(BaseSerializer):
issue = IssueIntakeSerializer(read_only=True)
class Meta:
model = InboxIssue
model = IntakeIssue
fields = [
"id",
"status",
@@ -53,14 +53,14 @@ class InboxIssueSerializer(BaseSerializer):
return super().to_representation(instance)
class InboxIssueDetailSerializer(BaseSerializer):
class IntakeIssueDetailSerializer(BaseSerializer):
issue = IssueDetailSerializer(read_only=True)
duplicate_issue_detail = IssueInboxSerializer(
duplicate_issue_detail = IssueIntakeSerializer(
read_only=True, source="duplicate_to"
)
class Meta:
model = InboxIssue
model = IntakeIssue
fields = [
"id",
"status",
@@ -69,6 +69,7 @@ class InboxIssueDetailSerializer(BaseSerializer):
"duplicate_issue_detail",
"source",
"issue",
"created_by",
]
read_only_fields = [
"project",
@@ -85,14 +86,14 @@ class InboxIssueDetailSerializer(BaseSerializer):
return super().to_representation(instance)
class InboxIssueLiteSerializer(BaseSerializer):
class IntakeIssueLiteSerializer(BaseSerializer):
class Meta:
model = InboxIssue
model = IntakeIssue
fields = ["id", "status", "duplicate_to", "snoozed_till", "source"]
read_only_fields = fields
class IssueStateInboxSerializer(BaseSerializer):
class IssueStateIntakeSerializer(BaseSerializer):
state_detail = StateLiteSerializer(read_only=True, source="state")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
label_details = LabelLiteSerializer(
@@ -102,7 +103,7 @@ class IssueStateInboxSerializer(BaseSerializer):
read_only=True, source="assignees", many=True
)
sub_issues_count = serializers.IntegerField(read_only=True)
issue_inbox = InboxIssueLiteSerializer(read_only=True, many=True)
issue_intake = IntakeIssueLiteSerializer(read_only=True, many=True)
class Meta:
model = Issue

View File

@@ -27,7 +27,7 @@ from plane.db.models import (
Module,
ModuleIssue,
IssueLink,
IssueAttachment,
FileAsset,
IssueReaction,
CommentReaction,
IssueVote,
@@ -95,6 +95,8 @@ class IssueCreateSerializer(BaseSerializer):
write_only=True,
required=False,
)
project_id = serializers.UUIDField(source="project.id", read_only=True)
workspace_id = serializers.UUIDField(source="workspace.id", read_only=True)
class Meta:
model = Issue
@@ -498,8 +500,11 @@ class IssueLinkLiteSerializer(BaseSerializer):
class IssueAttachmentSerializer(BaseSerializer):
asset_url = serializers.CharField(read_only=True)
class Meta:
model = IssueAttachment
model = FileAsset
fields = "__all__"
read_only_fields = [
"created_by",
@@ -514,14 +519,15 @@ class IssueAttachmentSerializer(BaseSerializer):
class IssueAttachmentLiteSerializer(DynamicBaseSerializer):
class Meta:
model = IssueAttachment
model = FileAsset
fields = [
"id",
"asset",
"attributes",
"issue_id",
# "issue_id",
"updated_at",
"updated_by",
"asset_url",
]
read_only_fields = fields
@@ -639,7 +645,7 @@ class IssueStateSerializer(DynamicBaseSerializer):
fields = "__all__"
class IssueInboxSerializer(DynamicBaseSerializer):
class IssueIntakeSerializer(DynamicBaseSerializer):
label_ids = serializers.ListField(
child=serializers.UUIDField(),
required=False,

View File

@@ -12,6 +12,7 @@ class NotificationSerializer(BaseSerializer):
read_only=True, source="triggered_by"
)
is_inbox_issue = serializers.BooleanField(read_only=True)
is_intake_issue = serializers.BooleanField(read_only=True)
is_mentioned_notification = serializers.BooleanField(read_only=True)
class Meta:

View File

@@ -22,6 +22,7 @@ class ProjectSerializer(BaseSerializer):
workspace_detail = WorkspaceLiteSerializer(
source="workspace", read_only=True
)
inbox_view = serializers.BooleanField(read_only=True, source="intake_view")
class Meta:
model = Project
@@ -95,6 +96,7 @@ class ProjectLiteSerializer(BaseSerializer):
"identifier",
"name",
"cover_image",
"cover_image_url",
"logo_props",
"description",
]
@@ -117,6 +119,8 @@ class ProjectListSerializer(DynamicBaseSerializer):
member_role = serializers.IntegerField(read_only=True)
anchor = serializers.CharField(read_only=True)
members = serializers.SerializerMethodField()
cover_image_url = serializers.CharField(read_only=True)
inbox_view = serializers.BooleanField(read_only=True, source="intake_view")
def get_members(self, obj):
project_members = getattr(obj, "members_list", None)
@@ -128,6 +132,7 @@ class ProjectListSerializer(DynamicBaseSerializer):
"member_id": member.member_id,
"member__display_name": member.member.display_name,
"member__avatar": member.member.avatar,
"member__avatar_url": member.member.avatar_url,
}
for member in project_members
]

View File

@@ -56,12 +56,15 @@ class UserSerializer(BaseSerializer):
class UserMeSerializer(BaseSerializer):
class Meta:
model = User
fields = [
"id",
"avatar",
"cover_image",
"avatar_url",
"cover_image_url",
"date_joined",
"display_name",
"email",
@@ -156,6 +159,7 @@ class UserLiteSerializer(BaseSerializer):
"first_name",
"last_name",
"avatar",
"avatar_url",
"is_bot",
"display_name",
]
@@ -173,6 +177,7 @@ class UserAdminLiteSerializer(BaseSerializer):
"first_name",
"last_name",
"avatar",
"avatar_url",
"is_bot",
"display_name",
"email",

View File

@@ -22,6 +22,7 @@ class WorkSpaceSerializer(DynamicBaseSerializer):
owner = UserLiteSerializer(read_only=True)
total_members = serializers.IntegerField(read_only=True)
total_issues = serializers.IntegerField(read_only=True)
logo_url = serializers.CharField(read_only=True)
def validate_slug(self, value):
# Check if the slug is restricted
@@ -39,6 +40,7 @@ class WorkSpaceSerializer(DynamicBaseSerializer):
"created_at",
"updated_at",
"owner",
"logo_url",
]
@@ -63,6 +65,7 @@ class WorkSpaceMemberSerializer(DynamicBaseSerializer):
class WorkspaceMemberMeSerializer(BaseSerializer):
draft_issue_count = serializers.IntegerField(read_only=True)
class Meta:
model = WorkspaceMember
fields = "__all__"

View File

@@ -5,7 +5,7 @@ from .cycle import urlpatterns as cycle_urls
from .dashboard import urlpatterns as dashboard_urls
from .estimate import urlpatterns as estimate_urls
from .external import urlpatterns as external_urls
from .inbox import urlpatterns as inbox_urls
from .intake import urlpatterns as intake_urls
from .issue import urlpatterns as issue_urls
from .module import urlpatterns as module_urls
from .notification import urlpatterns as notification_urls
@@ -25,7 +25,7 @@ urlpatterns = [
*dashboard_urls,
*estimate_urls,
*external_urls,
*inbox_urls,
*intake_urls,
*issue_urls,
*module_urls,
*notification_urls,

View File

@@ -5,6 +5,13 @@ from plane.app.views import (
FileAssetEndpoint,
UserAssetsEndpoint,
FileAssetViewSet,
# V2 Endpoints
WorkspaceFileAssetEndpoint,
UserAssetsV2Endpoint,
StaticFileAssetEndpoint,
AssetRestoreEndpoint,
ProjectAssetEndpoint,
ProjectBulkAssetEndpoint,
)
@@ -38,4 +45,49 @@ urlpatterns = [
),
name="file-assets-restore",
),
# V2 Endpoints
path(
"assets/v2/workspaces/<str:slug>/",
WorkspaceFileAssetEndpoint.as_view(),
name="workspace-file-assets",
),
path(
"assets/v2/workspaces/<str:slug>/<uuid:asset_id>/",
WorkspaceFileAssetEndpoint.as_view(),
name="workspace-file-assets",
),
path(
"assets/v2/user-assets/",
UserAssetsV2Endpoint.as_view(),
name="user-file-assets",
),
path(
"assets/v2/user-assets/<uuid:asset_id>/",
UserAssetsV2Endpoint.as_view(),
name="user-file-assets",
),
path(
"assets/v2/workspaces/<str:slug>/restore/<uuid:asset_id>/",
AssetRestoreEndpoint.as_view(),
name="asset-restore",
),
path(
"assets/v2/static/<uuid:asset_id>/",
StaticFileAssetEndpoint.as_view(),
name="static-file-asset",
),
path(
"assets/v2/workspaces/<str:slug>/projects/<uuid:project_id>/",
ProjectAssetEndpoint.as_view(),
name="bulk-asset-update",
),
path(
"assets/v2/workspaces/<str:slug>/projects/<uuid:project_id>/<uuid:pk>/",
ProjectAssetEndpoint.as_view(),
name="bulk-asset-update",
),
path(
"assets/v2/workspaces/<str:slug>/projects/<uuid:project_id>/<uuid:entity_id>/bulk/",
ProjectBulkAssetEndpoint.as_view(),
),
]

View File

@@ -1,53 +0,0 @@
from django.urls import path
from plane.app.views import (
InboxViewSet,
InboxIssueViewSet,
)
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inboxes/",
InboxViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="inbox",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inboxes/<uuid:pk>/",
InboxViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="inbox",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inbox-issues/",
InboxIssueViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="inbox-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inbox-issues/<uuid:pk>/",
InboxIssueViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="inbox-issue",
),
]

View File

@@ -0,0 +1,95 @@
from django.urls import path
from plane.app.views import (
IntakeViewSet,
IntakeIssueViewSet,
)
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/intakes/",
IntakeViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="intake",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/intakes/<uuid:pk>/",
IntakeViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="intake",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/intake-issues/",
IntakeIssueViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="intake-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/intake-issues/<uuid:pk>/",
IntakeIssueViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="intake-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inboxes/",
IntakeViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="inbox",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inboxes/<uuid:pk>/",
IntakeViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="inbox",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inbox-issues/",
IntakeIssueViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="inbox-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inbox-issues/<uuid:pk>/",
IntakeIssueViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="inbox-issue",
),
]

View File

@@ -11,7 +11,6 @@ from plane.app.views import (
IssueActivityEndpoint,
IssueArchiveViewSet,
IssueCommentViewSet,
IssueDraftViewSet,
IssueListEndpoint,
IssueReactionViewSet,
IssueRelationViewSet,
@@ -22,6 +21,9 @@ from plane.app.views import (
BulkArchiveIssuesEndpoint,
DeletedIssuesListViewSet,
IssuePaginatedViewSet,
IssueDetailEndpoint,
IssueAttachmentV2Endpoint,
IssueBulkUpdateDateEndpoint,
)
urlpatterns = [
@@ -40,9 +42,15 @@ urlpatterns = [
),
name="project-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues-detail/",
IssueDetailEndpoint.as_view(),
name="project-issue-detail",
),
# updated v1 paginated issues
# updated v2 paginated issues
path(
"workspaces/<str:slug>/v2/issues/",
"workspaces/<str:slug>/projects/<uuid:project_id>/v2/issues/",
IssuePaginatedViewSet.as_view({"get": "list"}),
name="project-issues-paginated",
),
@@ -133,6 +141,18 @@ urlpatterns = [
IssueAttachmentEndpoint.as_view(),
name="project-issue-attachments",
),
# V2 Attachments
path(
"assets/v2/workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/attachments/",
IssueAttachmentV2Endpoint.as_view(),
name="project-issue-attachments",
),
path(
"assets/v2/workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/attachments/<uuid:pk>/",
IssueAttachmentV2Endpoint.as_view(),
name="project-issue-attachments",
),
## Export Issues
path(
"workspaces/<str:slug>/export-issues/",
ExportIssuesEndpoint.as_view(),
@@ -290,31 +310,14 @@ urlpatterns = [
name="issue-relation",
),
## End Issue Relation
## Issue Drafts
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issue-drafts/",
IssueDraftViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-issue-draft",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issue-drafts/<uuid:pk>/",
IssueDraftViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-issue-draft",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/deleted-issues/",
DeletedIssuesListViewSet.as_view(),
name="deleted-issues",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issue-dates/",
IssueBulkUpdateDateEndpoint.as_view(),
name="project-issue-dates",
),
]

View File

@@ -27,6 +27,7 @@ from plane.app.views import (
WorkspaceCyclesEndpoint,
WorkspaceFavoriteEndpoint,
WorkspaceFavoriteGroupEndpoint,
WorkspaceDraftIssueViewSet,
)
@@ -254,4 +255,30 @@ urlpatterns = [
WorkspaceFavoriteGroupEndpoint.as_view(),
name="workspace-user-favorites-groups",
),
path(
"workspaces/<str:slug>/draft-issues/",
WorkspaceDraftIssueViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="workspace-draft-issues",
),
path(
"workspaces/<str:slug>/draft-issues/<uuid:pk>/",
WorkspaceDraftIssueViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="workspace-drafts-issues",
),
path(
"workspaces/<str:slug>/draft-to-issue/<uuid:draft_id>/",
WorkspaceDraftIssueViewSet.as_view({"post": "create_draft_to_issue"}),
name="workspace-drafts-issues",
),
]

View File

@@ -40,6 +40,8 @@ from .workspace.base import (
ExportWorkspaceUserActivityEndpoint,
)
from .workspace.draft import WorkspaceDraftIssueViewSet
from .workspace.favorite import (
WorkspaceFavoriteEndpoint,
WorkspaceFavoriteGroupEndpoint,
@@ -108,7 +110,19 @@ from .cycle.archive import (
CycleArchiveUnarchiveEndpoint,
)
from .asset.base import FileAssetEndpoint, UserAssetsEndpoint, FileAssetViewSet
from .asset.base import (
FileAssetEndpoint,
UserAssetsEndpoint,
FileAssetViewSet,
)
from .asset.v2 import (
WorkspaceFileAssetEndpoint,
UserAssetsV2Endpoint,
StaticFileAssetEndpoint,
AssetRestoreEndpoint,
ProjectAssetEndpoint,
ProjectBulkAssetEndpoint,
)
from .issue.base import (
IssueListEndpoint,
IssueViewSet,
@@ -116,6 +130,8 @@ from .issue.base import (
BulkDeleteIssuesEndpoint,
DeletedIssuesListViewSet,
IssuePaginatedViewSet,
IssueDetailEndpoint,
IssueBulkUpdateDateEndpoint,
)
from .issue.activity import (
@@ -126,6 +142,8 @@ from .issue.archive import IssueArchiveViewSet, BulkArchiveIssuesEndpoint
from .issue.attachment import (
IssueAttachmentEndpoint,
# V2
IssueAttachmentV2Endpoint,
)
from .issue.comment import (
@@ -133,8 +151,6 @@ from .issue.comment import (
CommentReactionViewSet,
)
from .issue.draft import IssueDraftViewSet
from .issue.label import (
LabelViewSet,
BulkCreateIssueLabelsEndpoint,
@@ -204,7 +220,7 @@ from .estimate.base import (
EstimatePointEndpoint,
)
from .inbox.base import InboxViewSet, InboxIssueViewSet
from .intake.base import IntakeViewSet, IntakeIssueViewSet
from .analytic.base import (
AnalyticsEndpoint,

View File

@@ -1,7 +1,10 @@
# Django imports
from django.db.models import Count, F, Sum
from django.db.models import Count, F, Sum, Q
from django.db.models.functions import ExtractMonth
from django.utils import timezone
from django.db.models.functions import Concat
from django.db.models import Case, When, Value
from django.db import models
# Third party imports
from rest_framework import status
@@ -107,7 +110,10 @@ class AnalyticsEndpoint(BaseAPIView):
if x_axis in ["labels__id"] or segment in ["labels__id"]:
label_details = (
Issue.objects.filter(
workspace__slug=slug, **filters, labels__id__isnull=False
workspace__slug=slug,
**filters,
labels__id__isnull=False,
label_issue__deleted_at__isnull=True,
)
.distinct("labels__id")
.order_by("labels__id")
@@ -118,14 +124,37 @@ class AnalyticsEndpoint(BaseAPIView):
if x_axis in ["assignees__id"] or segment in ["assignees__id"]:
assignee_details = (
Issue.issue_objects.filter(
Q(
Q(assignees__avatar__isnull=False)
| Q(assignees__avatar_asset__isnull=False)
),
workspace__slug=slug,
**filters,
assignees__avatar__isnull=False,
)
.annotate(
assignees__avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
assignees__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"assignees__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
assignees__avatar_asset__isnull=True,
then="assignees__avatar",
),
default=Value(None),
output_field=models.CharField(),
)
)
.order_by("assignees__id")
.distinct("assignees__id")
.values(
"assignees__avatar",
"assignees__avatar_url",
"assignees__display_name",
"assignees__first_name",
"assignees__last_name",
@@ -142,6 +171,7 @@ class AnalyticsEndpoint(BaseAPIView):
workspace__slug=slug,
**filters,
issue_cycle__cycle_id__isnull=False,
issue_cycle__deleted_at__isnull=True,
)
.distinct("issue_cycle__cycle_id")
.order_by("issue_cycle__cycle_id")
@@ -160,6 +190,7 @@ class AnalyticsEndpoint(BaseAPIView):
workspace__slug=slug,
**filters,
issue_module__module_id__isnull=False,
issue_module__deleted_at__isnull=True,
)
.distinct("issue_module__module_id")
.order_by("issue_module__module_id")
@@ -355,7 +386,6 @@ class DefaultAnalyticsEndpoint(BaseAPIView):
user_details = [
"created_by__first_name",
"created_by__last_name",
"created_by__avatar",
"created_by__display_name",
"created_by__id",
]
@@ -364,13 +394,32 @@ class DefaultAnalyticsEndpoint(BaseAPIView):
base_issues.exclude(created_by=None)
.values(*user_details)
.annotate(count=Count("id"))
.annotate(
created_by__avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
created_by__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"created_by__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
created_by__avatar_asset__isnull=True,
then="created_by__avatar",
),
default=Value(None),
output_field=models.CharField(),
)
)
.order_by("-count")[:5]
)
user_assignee_details = [
"assignees__first_name",
"assignees__last_name",
"assignees__avatar",
"assignees__display_name",
"assignees__id",
]
@@ -379,6 +428,26 @@ class DefaultAnalyticsEndpoint(BaseAPIView):
base_issues.filter(completed_at__isnull=False)
.exclude(assignees=None)
.values(*user_assignee_details)
.annotate(
assignees__avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
assignees__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"assignees__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
assignees__avatar_asset__isnull=True,
then="assignees__avatar",
),
default=Value(None),
output_field=models.CharField(),
)
)
.annotate(count=Count("id"))
.order_by("-count")[:5]
)
@@ -387,6 +456,26 @@ class DefaultAnalyticsEndpoint(BaseAPIView):
base_issues.filter(completed_at__isnull=True)
.values(*user_assignee_details)
.annotate(count=Count("id"))
.annotate(
assignees__avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
assignees__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"assignees__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
assignees__avatar_asset__isnull=True,
then="assignees__avatar",
),
default=Value(None),
output_field=models.CharField(),
)
)
.order_by("-count")
)
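The repeated `Case`/`When` annotation above computes an avatar URL with a two-tier fallback: prefer the uploaded avatar asset, else the legacy `avatar` field. The same logic in plain Python; the helper is illustrative, not Plane's code, and the `/api/assets/v2/static/` prefix is taken from the query above:

```python
from typing import Optional

def resolve_avatar_url(
    avatar_asset_id: Optional[str], avatar: Optional[str]
) -> Optional[str]:
    if avatar_asset_id is not None:
        # Asset exists: build the static-asset URL from its id
        return f"/api/assets/v2/static/{avatar_asset_id}/"
    # No asset uploaded: fall back to the legacy avatar URL (may be None)
    return avatar
```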

View File

@@ -50,7 +50,7 @@ class FileAssetEndpoint(BaseAPIView):
asset_key = str(workspace_id) + "/" + asset_key
file_asset = FileAsset.objects.get(asset=asset_key)
file_asset.is_deleted = True
file_asset.save()
file_asset.save(update_fields=["is_deleted"])
return Response(status=status.HTTP_204_NO_CONTENT)
@@ -59,7 +59,7 @@ class FileAssetViewSet(BaseViewSet):
asset_key = str(workspace_id) + "/" + asset_key
file_asset = FileAsset.objects.get(asset=asset_key)
file_asset.is_deleted = False
file_asset.save()
file_asset.save(update_fields=["is_deleted"])
return Response(status=status.HTTP_204_NO_CONTENT)
@@ -96,5 +96,5 @@ class UserAssetsEndpoint(BaseAPIView):
asset=asset_key, created_by=request.user
)
file_asset.is_deleted = True
file_asset.save()
file_asset.save(update_fields=["is_deleted"])
return Response(status=status.HTTP_204_NO_CONTENT)

View File

@@ -0,0 +1,803 @@
# Python imports
import uuid
# Django imports
from django.conf import settings
from django.http import HttpResponseRedirect
from django.utils import timezone
# Third party imports
from rest_framework import status
from rest_framework.response import Response
from rest_framework.permissions import AllowAny
# Module imports
from ..base import BaseAPIView
from plane.db.models import (
FileAsset,
Workspace,
Project,
User,
)
from plane.settings.storage import S3Storage
from plane.app.permissions import allow_permission, ROLE
from plane.utils.cache import invalidate_cache_directly
from plane.bgtasks.storage_metadata_task import get_asset_object_metadata
class UserAssetsV2Endpoint(BaseAPIView):
"""This endpoint is used to upload user profile images."""
def asset_delete(self, asset_id):
asset = FileAsset.objects.filter(id=asset_id).first()
if asset is None:
return
asset.is_deleted = True
asset.deleted_at = timezone.now()
asset.save(update_fields=["is_deleted", "deleted_at"])
return
def entity_asset_save(self, asset_id, entity_type, asset, request):
# User Avatar
if entity_type == FileAsset.EntityTypeContext.USER_AVATAR:
user = User.objects.get(id=asset.user_id)
user.avatar = ""
# Delete the previous avatar
if user.avatar_asset_id:
self.asset_delete(user.avatar_asset_id)
# Save the new avatar
user.avatar_asset_id = asset_id
user.save()
invalidate_cache_directly(
path="/api/users/me/",
url_params=False,
user=True,
request=request,
)
invalidate_cache_directly(
path="/api/users/me/settings/",
url_params=False,
user=True,
request=request,
)
return
# User Cover
if entity_type == FileAsset.EntityTypeContext.USER_COVER:
user = User.objects.get(id=asset.user_id)
user.cover_image = None
# Delete the previous cover image
if user.cover_image_asset_id:
self.asset_delete(user.cover_image_asset_id)
# Save the new cover image
user.cover_image_asset_id = asset_id
user.save()
invalidate_cache_directly(
path="/api/users/me/",
url_params=False,
user=True,
request=request,
)
invalidate_cache_directly(
path="/api/users/me/settings/",
url_params=False,
user=True,
request=request,
)
return
return
def entity_asset_delete(self, entity_type, asset, request):
# User Avatar
if entity_type == FileAsset.EntityTypeContext.USER_AVATAR:
user = User.objects.get(id=asset.user_id)
user.avatar_asset_id = None
user.save()
invalidate_cache_directly(
path="/api/users/me/",
url_params=False,
user=True,
request=request,
)
invalidate_cache_directly(
path="/api/users/me/settings/",
url_params=False,
user=True,
request=request,
)
return
# User Cover
if entity_type == FileAsset.EntityTypeContext.USER_COVER:
user = User.objects.get(id=asset.user_id)
user.cover_image_asset_id = None
user.save()
invalidate_cache_directly(
path="/api/users/me/",
url_params=False,
user=True,
request=request,
)
invalidate_cache_directly(
path="/api/users/me/settings/",
url_params=False,
user=True,
request=request,
)
return
return
def post(self, request):
# get the asset key
name = request.data.get("name")
type = request.data.get("type", "image/jpeg")
size = int(request.data.get("size", settings.FILE_SIZE_LIMIT))
entity_type = request.data.get("entity_type", False)
# Check if the file size is within the limit
size_limit = min(size, settings.FILE_SIZE_LIMIT)
# Check if the entity type is allowed
if not entity_type or entity_type not in ["USER_AVATAR", "USER_COVER"]:
return Response(
{
"error": "Invalid entity type.",
"status": False,
},
status=status.HTTP_400_BAD_REQUEST,
)
# Check if the file type is allowed
allowed_types = ["image/jpeg", "image/png", "image/webp", "image/jpg"]
if type not in allowed_types:
return Response(
{
"error": "Invalid file type. Only JPEG and PNG files are allowed.",
"status": False,
},
status=status.HTTP_400_BAD_REQUEST,
)
# asset key
asset_key = f"{uuid.uuid4().hex}-{name}"
# Create a File Asset
asset = FileAsset.objects.create(
attributes={
"name": name,
"type": type,
"size": size_limit,
},
asset=asset_key,
size=size_limit,
user=request.user,
created_by=request.user,
entity_type=entity_type,
)
# Get the presigned URL
storage = S3Storage(request=request)
# Generate a presigned URL to share an S3 object
presigned_url = storage.generate_presigned_post(
object_name=asset_key,
file_type=type,
file_size=size_limit,
)
# Return the presigned URL
return Response(
{
"upload_data": presigned_url,
"asset_id": str(asset.id),
"asset_url": asset.asset_url,
},
status=status.HTTP_200_OK,
)
def patch(self, request, asset_id):
# get the asset id
asset = FileAsset.objects.get(id=asset_id, user_id=request.user.id)
# get the storage metadata
asset.is_uploaded = True
# get the storage metadata
if not asset.storage_metadata:
get_asset_object_metadata.delay(asset_id=str(asset_id))
# get the entity and save the asset id for the request field
self.entity_asset_save(
asset_id=asset_id,
entity_type=asset.entity_type,
asset=asset,
request=request,
)
# update the attributes
asset.attributes = request.data.get("attributes", asset.attributes)
# save the asset
asset.save(update_fields=["is_uploaded", "attributes"])
return Response(status=status.HTTP_204_NO_CONTENT)
def delete(self, request, asset_id):
asset = FileAsset.objects.get(id=asset_id, user_id=request.user.id)
asset.is_deleted = True
asset.deleted_at = timezone.now()
# get the entity and save the asset id for the request field
self.entity_asset_delete(
entity_type=asset.entity_type, asset=asset, request=request
)
asset.save(update_fields=["is_deleted", "deleted_at"])
return Response(status=status.HTTP_204_NO_CONTENT)
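`UserAssetsV2Endpoint.post` above validates the request before creating a `FileAsset`: whitelist the entity type and MIME type, clamp the declared size to the server limit, and derive a collision-free asset key. A condensed sketch under stated assumptions; the helper name and the 5 MB default are ours (the real limit comes from `settings.FILE_SIZE_LIMIT`):

```python
import uuid

ALLOWED_ENTITY_TYPES = {"USER_AVATAR", "USER_COVER"}
ALLOWED_MIME_TYPES = {"image/jpeg", "image/png", "image/webp", "image/jpg"}
FILE_SIZE_LIMIT = 5 * 1024 * 1024  # assumed default; configured via settings

def validate_user_asset(name: str, mime: str, size: int, entity_type: str):
    if entity_type not in ALLOWED_ENTITY_TYPES:
        return None, "Invalid entity type."
    if mime not in ALLOWED_MIME_TYPES:
        return None, "Invalid file type."
    size_limit = min(size, FILE_SIZE_LIMIT)  # never trust the client-declared size
    asset_key = f"{uuid.uuid4().hex}-{name}"  # random prefix avoids key collisions
    return {"asset": asset_key, "size": size_limit}, None
```

Only after this validation does the view create the `FileAsset` row and hand back a presigned S3 URL, so nothing is stored for rejected requests.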
class WorkspaceFileAssetEndpoint(BaseAPIView):
"""This endpoint is used to upload cover images/logos etc for workspace, projects and users."""
def get_entity_id_field(self, entity_type, entity_id):
# Workspace Logo
if entity_type == FileAsset.EntityTypeContext.WORKSPACE_LOGO:
return {
"workspace_id": entity_id,
}
# Project Cover
if entity_type == FileAsset.EntityTypeContext.PROJECT_COVER:
return {
"project_id": entity_id,
}
# User Avatar and Cover
if entity_type in [
FileAsset.EntityTypeContext.USER_AVATAR,
FileAsset.EntityTypeContext.USER_COVER,
]:
return {
"user_id": entity_id,
}
# Issue Attachment and Description
if entity_type in [
FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
FileAsset.EntityTypeContext.ISSUE_DESCRIPTION,
]:
return {
"issue_id": entity_id,
}
# Page Description
if entity_type == FileAsset.EntityTypeContext.PAGE_DESCRIPTION:
return {
"page_id": entity_id,
}
# Comment Description
if entity_type == FileAsset.EntityTypeContext.COMMENT_DESCRIPTION:
return {
"comment_id": entity_id,
}
return {}
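`get_entity_id_field` above maps an entity type to the foreign-key column the asset row should populate. An equivalent table-driven form, for illustration only; the real code branches on `FileAsset.EntityTypeContext` enum members rather than bare strings:

```python
ENTITY_FK_FIELD = {
    "WORKSPACE_LOGO": "workspace_id",
    "PROJECT_COVER": "project_id",
    "USER_AVATAR": "user_id",
    "USER_COVER": "user_id",
    "ISSUE_ATTACHMENT": "issue_id",
    "ISSUE_DESCRIPTION": "issue_id",
    "PAGE_DESCRIPTION": "page_id",
    "COMMENT_DESCRIPTION": "comment_id",
}

def entity_id_field(entity_type: str, entity_id: str) -> dict:
    # Unknown entity types produce no extra kwargs, matching the
    # view's fall-through `return {}`
    field = ENTITY_FK_FIELD.get(entity_type)
    return {field: entity_id} if field else {}
```

The returned dict is splatted into `FileAsset.objects.create(**...)`, which is why an empty dict is the safe default.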
def asset_delete(self, asset_id):
asset = FileAsset.objects.filter(id=asset_id).first()
# Check if the asset exists
if asset is None:
return
# Mark the asset as deleted
asset.is_deleted = True
asset.deleted_at = timezone.now()
asset.save(update_fields=["is_deleted", "deleted_at"])
return
def entity_asset_save(self, asset_id, entity_type, asset, request):
# Workspace Logo
if entity_type == FileAsset.EntityTypeContext.WORKSPACE_LOGO:
workspace = Workspace.objects.filter(id=asset.workspace_id).first()
if workspace is None:
return
# Delete the previous logo
if workspace.logo_asset_id:
self.asset_delete(workspace.logo_asset_id)
# Save the new logo
workspace.logo = ""
workspace.logo_asset_id = asset_id
workspace.save()
invalidate_cache_directly(
path="/api/workspaces/",
url_params=False,
user=False,
request=request,
)
invalidate_cache_directly(
path="/api/users/me/workspaces/",
url_params=False,
user=True,
request=request,
)
invalidate_cache_directly(
path="/api/instances/",
url_params=False,
user=False,
request=request,
)
return
# Project Cover
elif entity_type == FileAsset.EntityTypeContext.PROJECT_COVER:
project = Project.objects.filter(id=asset.project_id).first()
if project is None:
return
# Delete the previous cover image
if project.cover_image_asset_id:
self.asset_delete(project.cover_image_asset_id)
# Save the new cover image
project.cover_image = ""
project.cover_image_asset_id = asset_id
project.save()
return
else:
return
def entity_asset_delete(self, entity_type, asset, request):
# Workspace Logo
if entity_type == FileAsset.EntityTypeContext.WORKSPACE_LOGO:
workspace = Workspace.objects.get(id=asset.workspace_id)
if workspace is None:
return
workspace.logo_asset_id = None
workspace.save()
invalidate_cache_directly(
path="/api/workspaces/",
url_params=False,
user=False,
request=request,
)
invalidate_cache_directly(
path="/api/users/me/workspaces/",
url_params=False,
user=True,
request=request,
)
invalidate_cache_directly(
path="/api/instances/",
url_params=False,
user=False,
request=request,
)
return
# Project Cover
elif entity_type == FileAsset.EntityTypeContext.PROJECT_COVER:
project = Project.objects.filter(id=asset.project_id).first()
if project is None:
return
project.cover_image_asset_id = None
project.save()
return
else:
return
def post(self, request, slug):
name = request.data.get("name")
type = request.data.get("type", "image/jpeg")
size = int(request.data.get("size", settings.FILE_SIZE_LIMIT))
entity_type = request.data.get("entity_type")
entity_identifier = request.data.get("entity_identifier", False)
# Check if the entity type is allowed
if entity_type not in FileAsset.EntityTypeContext.values:
return Response(
{
"error": "Invalid entity type.",
"status": False,
},
status=status.HTTP_400_BAD_REQUEST,
)
# Check if the file type is allowed
allowed_types = ["image/jpeg", "image/png", "image/webp", "image/jpg"]
if type not in allowed_types:
return Response(
{
"error": "Invalid file type. Only JPEG, PNG, and WebP files are allowed.",
"status": False,
},
status=status.HTTP_400_BAD_REQUEST,
)
# Get the size limit
size_limit = min(settings.FILE_SIZE_LIMIT, size)
# Get the workspace
workspace = Workspace.objects.get(slug=slug)
# asset key
asset_key = f"{workspace.id}/{uuid.uuid4().hex}-{name}"
# Create a File Asset
asset = FileAsset.objects.create(
attributes={
"name": name,
"type": type,
"size": size_limit,
},
asset=asset_key,
size=size_limit,
workspace=workspace,
created_by=request.user,
entity_type=entity_type,
**self.get_entity_id_field(
entity_type=entity_type, entity_id=entity_identifier
),
)
# Get the presigned URL
storage = S3Storage(request=request)
# Generate a presigned URL to share an S3 object
presigned_url = storage.generate_presigned_post(
object_name=asset_key,
file_type=type,
file_size=size_limit,
)
# Return the presigned URL
return Response(
{
"upload_data": presigned_url,
"asset_id": str(asset.id),
"asset_url": asset.asset_url,
},
status=status.HTTP_200_OK,
)
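The checks at the top of `post` reduce to three rules: the entity type must be a known context, the MIME type must be on the allowlist, and the stored size is the client size clamped to the server cap. A standalone sketch of that validation (hypothetical helper and constants, not part of this codebase; `FILE_SIZE_LIMIT` stands in for the Django setting):

```python
# Hypothetical sketch of the request validation in `post` above.
ALLOWED_TYPES = {"image/jpeg", "image/png", "image/webp", "image/jpg"}
FILE_SIZE_LIMIT = 5 * 1024 * 1024  # assumed cap; the real value comes from settings

def validate_upload(entity_type, file_type, size, valid_entity_types):
    """Return (ok, error_message_or_clamped_size) following the endpoint's checks."""
    if entity_type not in valid_entity_types:
        return False, "Invalid entity type."
    if file_type not in ALLOWED_TYPES:
        return False, "Invalid file type."
    # The server never trusts the client-reported size beyond its own cap
    return True, min(FILE_SIZE_LIMIT, int(size))
```

Clamping with `min` (rather than rejecting oversized requests) matters because the clamped value is also what gets baked into the S3 presigned POST conditions.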
def patch(self, request, slug, asset_id):
# get the asset
asset = FileAsset.objects.get(id=asset_id, workspace__slug=slug)
# mark the asset as uploaded
asset.is_uploaded = True
# get the storage metadata
if not asset.storage_metadata:
get_asset_object_metadata.delay(asset_id=str(asset_id))
# get the entity and save the asset id for the request field
self.entity_asset_save(
asset_id=asset_id,
entity_type=asset.entity_type,
asset=asset,
request=request,
)
# update the attributes
asset.attributes = request.data.get("attributes", asset.attributes)
# save the asset
asset.save(update_fields=["is_uploaded", "attributes"])
return Response(status=status.HTTP_204_NO_CONTENT)
def delete(self, request, slug, asset_id):
asset = FileAsset.objects.get(id=asset_id, workspace__slug=slug)
asset.is_deleted = True
asset.deleted_at = timezone.now()
# get the entity and save the asset id for the request field
self.entity_asset_delete(
entity_type=asset.entity_type, asset=asset, request=request
)
asset.save(update_fields=["is_deleted", "deleted_at"])
return Response(status=status.HTTP_204_NO_CONTENT)
def get(self, request, slug, asset_id):
# get the asset id
asset = FileAsset.objects.get(id=asset_id, workspace__slug=slug)
# Check if the asset is uploaded
if not asset.is_uploaded:
return Response(
{
"error": "The requested asset could not be found.",
},
status=status.HTTP_404_NOT_FOUND,
)
# Get the presigned URL
storage = S3Storage(request=request)
# Generate a presigned URL to share an S3 object
signed_url = storage.generate_presigned_url(
object_name=asset.asset.name,
)
# Redirect to the signed URL
return HttpResponseRedirect(signed_url)
class StaticFileAssetEndpoint(BaseAPIView):
"""This endpoint is used to get the signed URL for a static asset."""
permission_classes = [
AllowAny,
]
def get(self, request, asset_id):
# get the asset id
asset = FileAsset.objects.get(id=asset_id)
# Check if the asset is uploaded
if not asset.is_uploaded:
return Response(
{
"error": "The requested asset could not be found.",
},
status=status.HTTP_404_NOT_FOUND,
)
# Check if the entity type is allowed
if asset.entity_type not in [
FileAsset.EntityTypeContext.USER_AVATAR,
FileAsset.EntityTypeContext.USER_COVER,
FileAsset.EntityTypeContext.WORKSPACE_LOGO,
FileAsset.EntityTypeContext.PROJECT_COVER,
]:
return Response(
{
"error": "Invalid entity type.",
"status": False,
},
status=status.HTTP_400_BAD_REQUEST,
)
# Get the presigned URL
storage = S3Storage(request=request)
# Generate a presigned URL to share an S3 object
signed_url = storage.generate_presigned_url(
object_name=asset.asset.name,
)
# Redirect to the signed URL
return HttpResponseRedirect(signed_url)
class AssetRestoreEndpoint(BaseAPIView):
"""Endpoint to restore a deleted asset."""
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST], level="WORKSPACE")
def post(self, request, slug, asset_id):
asset = FileAsset.all_objects.get(id=asset_id, workspace__slug=slug)
asset.is_deleted = False
asset.deleted_at = None
asset.save(update_fields=["is_deleted", "deleted_at"])
return Response(status=status.HTTP_204_NO_CONTENT)
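The `delete` handlers and `AssetRestoreEndpoint.post` above are the two halves of a soft-delete toggle: deletion stamps `deleted_at` and restore clears both fields together. A minimal sketch of that lifecycle (hypothetical dataclass standing in for the `FileAsset` model):

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class Asset:
    # Hypothetical stand-in for FileAsset's soft-delete columns.
    is_deleted: bool = False
    deleted_at: Optional[datetime] = None

def soft_delete(asset: Asset) -> None:
    # Mirrors the delete handlers: flag plus timestamp, row kept in storage.
    asset.is_deleted = True
    asset.deleted_at = datetime.now(timezone.utc)

def restore(asset: Asset) -> None:
    # Mirrors AssetRestoreEndpoint.post: clear both fields together.
    asset.is_deleted = False
    asset.deleted_at = None
```

Note that restore uses `FileAsset.all_objects` in the view above, since the default manager presumably excludes soft-deleted rows.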
class ProjectAssetEndpoint(BaseAPIView):
"""This endpoint is used to upload cover images/logos etc. for workspaces, projects, and users."""
def get_entity_id_field(self, entity_type, entity_id):
if entity_type == FileAsset.EntityTypeContext.WORKSPACE_LOGO:
return {
"workspace_id": entity_id,
}
if entity_type == FileAsset.EntityTypeContext.PROJECT_COVER:
return {
"project_id": entity_id,
}
if entity_type in [
FileAsset.EntityTypeContext.USER_AVATAR,
FileAsset.EntityTypeContext.USER_COVER,
]:
return {
"user_id": entity_id,
}
if entity_type in [
FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
FileAsset.EntityTypeContext.ISSUE_DESCRIPTION,
]:
return {
"issue_id": entity_id,
}
if entity_type == FileAsset.EntityTypeContext.PAGE_DESCRIPTION:
return {
"page_id": entity_id,
}
if entity_type == FileAsset.EntityTypeContext.COMMENT_DESCRIPTION:
return {
"comment_id": entity_id,
}
if entity_type == FileAsset.EntityTypeContext.DRAFT_ISSUE_DESCRIPTION:
return {
"draft_issue_id": entity_id,
}
return {}
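The chain of `if` blocks in `get_entity_id_field` is really a dispatch from entity type to the foreign-key field that should receive the identifier. A table-driven sketch of the same mapping (hypothetical string keys standing in for the `FileAsset.EntityTypeContext` enum members):

```python
# Hypothetical sketch: the entity-type -> FK-field dispatch as a lookup table.
ENTITY_ID_FIELD = {
    "WORKSPACE_LOGO": "workspace_id",
    "PROJECT_COVER": "project_id",
    "USER_AVATAR": "user_id",
    "USER_COVER": "user_id",
    "ISSUE_ATTACHMENT": "issue_id",
    "ISSUE_DESCRIPTION": "issue_id",
    "PAGE_DESCRIPTION": "page_id",
    "COMMENT_DESCRIPTION": "comment_id",
    "DRAFT_ISSUE_DESCRIPTION": "draft_issue_id",
}

def entity_id_kwargs(entity_type, entity_id):
    """Return the kwargs dict splatted into FileAsset.objects.create(...)."""
    field = ENTITY_ID_FIELD.get(entity_type)
    return {field: entity_id} if field else {}
```

Unknown types fall through to `{}`, matching the `return {}` at the end of the method above.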
@allow_permission(
[ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST],
)
def post(self, request, slug, project_id):
name = request.data.get("name")
type = request.data.get("type", "image/jpeg")
size = int(request.data.get("size", settings.FILE_SIZE_LIMIT))
entity_type = request.data.get("entity_type", "")
entity_identifier = request.data.get("entity_identifier")
# Check if the entity type is allowed
if entity_type not in FileAsset.EntityTypeContext.values:
return Response(
{
"error": "Invalid entity type.",
"status": False,
},
status=status.HTTP_400_BAD_REQUEST,
)
# Check if the file type is allowed
allowed_types = ["image/jpeg", "image/png", "image/webp", "image/jpg"]
if type not in allowed_types:
return Response(
{
"error": "Invalid file type. Only JPEG, PNG, and WebP files are allowed.",
"status": False,
},
status=status.HTTP_400_BAD_REQUEST,
)
# Get the size limit
size_limit = min(settings.FILE_SIZE_LIMIT, size)
# Get the workspace
workspace = Workspace.objects.get(slug=slug)
# asset key
asset_key = f"{workspace.id}/{uuid.uuid4().hex}-{name}"
# Create a File Asset
asset = FileAsset.objects.create(
attributes={
"name": name,
"type": type,
"size": size_limit,
},
asset=asset_key,
size=size_limit,
workspace=workspace,
created_by=request.user,
entity_type=entity_type,
project_id=project_id,
**self.get_entity_id_field(entity_type, entity_identifier),
)
# Get the presigned URL
storage = S3Storage(request=request)
# Generate a presigned URL to share an S3 object
presigned_url = storage.generate_presigned_post(
object_name=asset_key,
file_type=type,
file_size=size_limit,
)
# Return the presigned URL
return Response(
{
"upload_data": presigned_url,
"asset_id": str(asset.id),
"asset_url": asset.asset_url,
},
status=status.HTTP_200_OK,
)
@allow_permission(
[ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST],
)
def patch(self, request, slug, project_id, pk):
# get the asset
asset = FileAsset.objects.get(
id=pk,
workspace__slug=slug,
project_id=project_id,
)
# mark the asset as uploaded
asset.is_uploaded = True
# get the storage metadata
if not asset.storage_metadata:
get_asset_object_metadata.delay(asset_id=str(pk))
# update the attributes
asset.attributes = request.data.get("attributes", asset.attributes)
# save the asset
asset.save(update_fields=["is_uploaded", "attributes"])
return Response(status=status.HTTP_204_NO_CONTENT)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def delete(self, request, slug, project_id, pk):
# Get the asset
asset = FileAsset.objects.get(
id=pk,
workspace__slug=slug,
project_id=project_id,
)
# Mark the asset as deleted
asset.is_deleted = True
asset.deleted_at = timezone.now()
# Save the asset
asset.save(update_fields=["is_deleted", "deleted_at"])
return Response(status=status.HTTP_204_NO_CONTENT)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def get(self, request, slug, project_id, pk):
# get the asset id
asset = FileAsset.objects.get(
workspace__slug=slug,
project_id=project_id,
pk=pk,
)
# Check if the asset is uploaded
if not asset.is_uploaded:
return Response(
{
"error": "The requested asset could not be found.",
},
status=status.HTTP_404_NOT_FOUND,
)
# Get the presigned URL
storage = S3Storage(request=request)
# Generate a presigned URL to share an S3 object
signed_url = storage.generate_presigned_url(
object_name=asset.asset.name,
)
# Redirect to the signed URL
return HttpResponseRedirect(signed_url)
class ProjectBulkAssetEndpoint(BaseAPIView):
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def post(self, request, slug, project_id, entity_id):
asset_ids = request.data.get("asset_ids", [])
# Check if the asset ids are provided
if not asset_ids:
return Response(
{
"error": "No asset ids provided.",
},
status=status.HTTP_400_BAD_REQUEST,
)
# get the assets
assets = FileAsset.objects.filter(
id__in=asset_ids,
workspace__slug=slug,
)
# Get the first asset
asset = assets.first()
if not asset:
return Response(
{
"error": "The requested asset could not be found.",
},
status=status.HTTP_404_NOT_FOUND,
)
# Attach the assets to the entity based on the asset's entity type
if asset.entity_type == FileAsset.EntityTypeContext.PROJECT_COVER:
assets.update(
project_id=project_id,
)
if asset.entity_type == FileAsset.EntityTypeContext.ISSUE_DESCRIPTION:
assets.update(
issue_id=entity_id,
)
if (
asset.entity_type
== FileAsset.EntityTypeContext.COMMENT_DESCRIPTION
):
assets.update(
comment_id=entity_id,
)
if asset.entity_type == FileAsset.EntityTypeContext.PAGE_DESCRIPTION:
assets.update(
page_id=entity_id,
)
if (
asset.entity_type
== FileAsset.EntityTypeContext.DRAFT_ISSUE_DESCRIPTION
):
assets.update(
draft_issue_id=entity_id,
)
return Response(status=status.HTTP_204_NO_CONTENT)

View File

@@ -1,6 +1,7 @@
# Django imports
from django.contrib.postgres.aggregates import ArrayAgg
from django.contrib.postgres.fields import ArrayField
from django.db import models
from django.db.models import (
Case,
CharField,
@@ -18,7 +19,7 @@ from django.db.models import (
Sum,
FloatField,
)
from django.db.models.functions import Coalesce, Cast
from django.db.models.functions import Coalesce, Cast, Concat
from django.utils import timezone
# Third party imports
@@ -47,6 +48,7 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
estimate_point__estimate__type="points",
state__group="backlog",
issue_cycle__cycle_id=OuterRef("pk"),
issue_cycle__deleted_at__isnull=True,
)
.values("issue_cycle__cycle_id")
.annotate(
@@ -61,6 +63,7 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
estimate_point__estimate__type="points",
state__group="unstarted",
issue_cycle__cycle_id=OuterRef("pk"),
issue_cycle__deleted_at__isnull=True,
)
.values("issue_cycle__cycle_id")
.annotate(
@@ -75,6 +78,7 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
estimate_point__estimate__type="points",
state__group="started",
issue_cycle__cycle_id=OuterRef("pk"),
issue_cycle__deleted_at__isnull=True,
)
.values("issue_cycle__cycle_id")
.annotate(
@@ -89,6 +93,7 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
estimate_point__estimate__type="points",
state__group="cancelled",
issue_cycle__cycle_id=OuterRef("pk"),
issue_cycle__deleted_at__isnull=True,
)
.values("issue_cycle__cycle_id")
.annotate(
@@ -103,6 +108,7 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
estimate_point__estimate__type="points",
state__group="completed",
issue_cycle__cycle_id=OuterRef("pk"),
issue_cycle__deleted_at__isnull=True,
)
.values("issue_cycle__cycle_id")
.annotate(
@@ -116,6 +122,7 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
Issue.issue_objects.filter(
estimate_point__estimate__type="points",
issue_cycle__cycle_id=OuterRef("pk"),
issue_cycle__deleted_at__isnull=True,
)
.values("issue_cycle__cycle_id")
.annotate(
@@ -139,7 +146,7 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
Prefetch(
"issue_cycle__issue__assignees",
queryset=User.objects.only(
"avatar", "first_name", "id"
"avatar_asset", "first_name", "id"
).distinct(),
)
)
@@ -159,6 +166,7 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
filter=Q(
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__issue__deleted_at__isnull=True,
),
)
)
@@ -170,6 +178,7 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
issue_cycle__issue__state__group="completed",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__issue__deleted_at__isnull=True,
),
)
)
@@ -181,6 +190,7 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
issue_cycle__issue__state__group="cancelled",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__issue__deleted_at__isnull=True,
),
)
)
@@ -192,6 +202,7 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
issue_cycle__issue__state__group="started",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__issue__deleted_at__isnull=True,
),
)
)
@@ -203,6 +214,7 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
issue_cycle__issue__state__group="unstarted",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__issue__deleted_at__isnull=True,
),
)
)
@@ -214,6 +226,7 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
issue_cycle__issue__state__group="backlog",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__issue__deleted_at__isnull=True,
),
)
)
@@ -341,6 +354,7 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
project_id=self.kwargs.get("project_id"),
parent__isnull=False,
issue_cycle__cycle_id=pk,
issue_cycle__deleted_at__isnull=True,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -395,13 +409,33 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
assignee_distribution = (
Issue.issue_objects.filter(
issue_cycle__cycle_id=pk,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
)
.annotate(display_name=F("assignees__display_name"))
.annotate(assignee_id=F("assignees__id"))
.annotate(avatar=F("assignees__avatar"))
.values("display_name", "assignee_id", "avatar")
.annotate(
avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
assignees__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"assignees__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
assignees__avatar_asset__isnull=True,
then="assignees__avatar",
),
default=Value(None),
output_field=models.CharField(),
)
)
.values("display_name", "assignee_id", "avatar_url")
.annotate(
total_estimates=Sum(
Cast("estimate_point__value", FloatField())
@@ -433,6 +467,7 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
label_distribution = (
Issue.issue_objects.filter(
issue_cycle__cycle_id=pk,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
)
@@ -488,19 +523,39 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
assignee_distribution = (
Issue.issue_objects.filter(
issue_cycle__cycle_id=pk,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
)
.annotate(first_name=F("assignees__first_name"))
.annotate(last_name=F("assignees__last_name"))
.annotate(assignee_id=F("assignees__id"))
.annotate(avatar=F("assignees__avatar"))
.annotate(
avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
assignees__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"assignees__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
assignees__avatar_asset__isnull=True,
then="assignees__avatar",
),
default=Value(None),
output_field=models.CharField(),
)
)
.annotate(display_name=F("assignees__display_name"))
.values(
"first_name",
"last_name",
"assignee_id",
"avatar",
"avatar_url",
"display_name",
)
.annotate(
@@ -539,6 +594,7 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
label_distribution = (
Issue.issue_objects.filter(
issue_cycle__cycle_id=pk,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
)
@@ -604,7 +660,7 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
pk=cycle_id, project_id=project_id, workspace__slug=slug
)
if cycle.end_date >= timezone.now().date():
if cycle.end_date >= timezone.now():
return Response(
{"error": "Only completed cycles can be archived"},
status=status.HTTP_400_BAD_REQUEST,

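The `Case`/`When` annotation added throughout this diff encodes a simple fallback: when `avatar_asset` is set, build a static-asset URL from it; otherwise fall back to the legacy `avatar` string. The same rule in plain Python (hypothetical helper, mirroring the ORM expression only):

```python
def resolve_avatar_url(avatar_asset_id, avatar):
    """Hypothetical mirror of the avatar_url Case/When annotation."""
    if avatar_asset_id is not None:
        # Matches Concat(Value("/api/assets/v2/static/"), asset_id, Value("/"))
        return f"/api/assets/v2/static/{avatar_asset_id}/"
    return avatar  # legacy avatar field; may itself be None
```

This is why the response-building code switches from `item["avatar"]` to `item.get(...)`: rows now carry `avatar_url`, and `avatar` may be absent.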
View File

@@ -20,7 +20,8 @@ from django.db.models import (
Sum,
FloatField,
)
from django.db.models.functions import Coalesce, Cast
from django.db import models
from django.db.models.functions import Coalesce, Cast, Concat
from django.utils import timezone
from django.core.serializers.json import DjangoJSONEncoder
@@ -47,6 +48,7 @@ from plane.db.models import (
)
from plane.utils.analytics_plot import burndown_plot
from plane.bgtasks.recent_visited_task import recent_visited_task
from plane.utils.user_timezone_converter import user_timezone_converter
# Module imports
from .. import BaseAPIView, BaseViewSet
@@ -81,7 +83,7 @@ class CycleViewSet(BaseViewSet):
Prefetch(
"issue_cycle__issue__assignees",
queryset=User.objects.only(
"avatar", "first_name", "id"
"avatar_asset", "first_name", "id"
).distinct(),
)
)
@@ -101,6 +103,7 @@ class CycleViewSet(BaseViewSet):
filter=Q(
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -112,6 +115,7 @@ class CycleViewSet(BaseViewSet):
issue_cycle__issue__state__group="completed",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -141,6 +145,11 @@ class CycleViewSet(BaseViewSet):
distinct=True,
filter=~Q(
issue_cycle__issue__assignees__id__isnull=True
)
& (
Q(
issue_cycle__issue__issue_assignee__deleted_at__isnull=True
)
),
),
Value([], output_field=ArrayField(UUIDField())),
@@ -187,10 +196,20 @@ class CycleViewSet(BaseViewSet):
"completed_issues",
"assignee_ids",
"status",
"version",
"created_by",
)
if data:
datetime_fields = [
"created_at",
"updated_at",
"start_date",
"end_date",
]
data = user_timezone_converter(
data, datetime_fields, request.user.user_timezone
)
return Response(data, status=status.HTTP_200_OK)
data = queryset.values(
@@ -216,8 +235,18 @@ class CycleViewSet(BaseViewSet):
"completed_issues",
"assignee_ids",
"status",
"version",
"created_by",
)
datetime_fields = [
"created_at",
"updated_at",
"start_date",
"end_date",
]
data = user_timezone_converter(
data, datetime_fields, request.user.user_timezone
)
return Response(data, status=status.HTTP_200_OK)
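`user_timezone_converter` is called here with the queryset's `values(...)` rows, a list of datetime field names, and the user's timezone. Its real implementation lives in `plane.utils.user_timezone_converter`; a sketch of what such a helper plausibly does, with the signature assumed from these call sites:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

def user_timezone_converter(data, datetime_fields, user_tz):
    """Hypothetical sketch: convert the named datetime fields of each row to user_tz."""
    tz = ZoneInfo(user_tz)
    rows = data if isinstance(data, list) else [data]
    for row in rows:
        for name in datetime_fields:
            value = row.get(name)
            if isinstance(value, datetime):
                row[name] = value.astimezone(tz)
    return data
```

Converting at the serialization boundary keeps storage in UTC while the API returns wall-clock times in the user's configured timezone.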
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
@@ -255,6 +284,7 @@ class CycleViewSet(BaseViewSet):
"external_id",
"progress_snapshot",
"logo_props",
"version",
# meta fields
"is_favorite",
"total_issues",
@@ -265,6 +295,15 @@ class CycleViewSet(BaseViewSet):
)
.first()
)
datetime_fields = [
"created_at",
"updated_at",
"start_date",
"end_date",
]
cycle = user_timezone_converter(
cycle, datetime_fields, request.user.user_timezone
)
# Send the model activity
model_activity.delay(
@@ -306,10 +345,7 @@ class CycleViewSet(BaseViewSet):
request_data = request.data
if (
cycle.end_date is not None
and cycle.end_date < timezone.now().date()
):
if cycle.end_date is not None and cycle.end_date < timezone.now():
if "sort_order" in request_data:
# Can only change sort order for a completed cycle
request_data = {
@@ -347,6 +383,7 @@ class CycleViewSet(BaseViewSet):
"external_id",
"progress_snapshot",
"logo_props",
"version",
# meta fields
"is_favorite",
"total_issues",
@@ -356,6 +393,16 @@ class CycleViewSet(BaseViewSet):
"created_by",
).first()
datetime_fields = [
"created_at",
"updated_at",
"start_date",
"end_date",
]
cycle = user_timezone_converter(
cycle, datetime_fields, request.user.user_timezone
)
# Send the model activity
model_activity.delay(
model_name="cycle",
@@ -389,6 +436,7 @@ class CycleViewSet(BaseViewSet):
project_id=self.kwargs.get("project_id"),
parent__isnull=False,
issue_cycle__cycle_id=pk,
issue_cycle__deleted_at__isnull=True,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -412,6 +460,7 @@ class CycleViewSet(BaseViewSet):
"progress_snapshot",
"sub_issues",
"logo_props",
"version",
# meta fields
"is_favorite",
"total_issues",
@@ -429,6 +478,16 @@ class CycleViewSet(BaseViewSet):
status=status.HTTP_404_NOT_FOUND,
)
datetime_fields = [
"created_at",
"updated_at",
"start_date",
"end_date",
]
data = user_timezone_converter(
data, datetime_fields, request.user.user_timezone
)
queryset = queryset.first()
recent_visited_task.delay(
@@ -485,12 +544,9 @@ class CycleViewSet(BaseViewSet):
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
)
# Delete the cycle
# TODO: Soft-delete the cycle; break the one-to-one relationship with cycle issue
cycle.delete()
# Delete the cycle issues
CycleIssue.objects.filter(
cycle_id=self.kwargs.get("pk"),
).delete()
# Delete the user favorite cycle
UserFavorite.objects.filter(
user=request.user,
@@ -594,6 +650,7 @@ class TransferCycleIssueEndpoint(BaseAPIView):
filter=Q(
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -604,6 +661,8 @@ class TransferCycleIssueEndpoint(BaseAPIView):
issue_cycle__issue__state__group="completed",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__issue__deleted_at__isnull=True,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -614,6 +673,8 @@ class TransferCycleIssueEndpoint(BaseAPIView):
issue_cycle__issue__state__group="cancelled",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__issue__deleted_at__isnull=True,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -624,6 +685,8 @@ class TransferCycleIssueEndpoint(BaseAPIView):
issue_cycle__issue__state__group="started",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__issue__deleted_at__isnull=True,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -634,6 +697,8 @@ class TransferCycleIssueEndpoint(BaseAPIView):
issue_cycle__issue__state__group="unstarted",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__issue__deleted_at__isnull=True,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -644,6 +709,8 @@ class TransferCycleIssueEndpoint(BaseAPIView):
issue_cycle__issue__state__group="backlog",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__issue__deleted_at__isnull=True,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -660,13 +727,33 @@ class TransferCycleIssueEndpoint(BaseAPIView):
assignee_estimate_data = (
Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
)
.annotate(display_name=F("assignees__display_name"))
.annotate(assignee_id=F("assignees__id"))
.annotate(avatar=F("assignees__avatar"))
.values("display_name", "assignee_id", "avatar")
.annotate(
avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
assignees__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"assignees__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
assignees__avatar_asset__isnull=True,
then="assignees__avatar",
),
default=Value(None),
output_field=models.CharField(),
)
)
.values("display_name", "assignee_id", "avatar_url")
.annotate(
total_estimates=Sum(
Cast("estimate_point__value", FloatField())
@@ -703,7 +790,8 @@ class TransferCycleIssueEndpoint(BaseAPIView):
if item["assignee_id"]
else None
),
"avatar": item["avatar"],
"avatar": item.get("avatar"),
"avatar_url": item.get("avatar_url"),
"total_estimates": item["total_estimates"],
"completed_estimates": item["completed_estimates"],
"pending_estimates": item["pending_estimates"],
@@ -714,6 +802,7 @@ class TransferCycleIssueEndpoint(BaseAPIView):
label_distribution_data = (
Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
)
@@ -775,13 +864,33 @@ class TransferCycleIssueEndpoint(BaseAPIView):
assignee_distribution = (
Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
)
.annotate(display_name=F("assignees__display_name"))
.annotate(assignee_id=F("assignees__id"))
.annotate(avatar=F("assignees__avatar"))
.values("display_name", "assignee_id", "avatar")
.annotate(
avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
assignees__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"assignees__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
assignees__avatar_asset__isnull=True,
then="assignees__avatar",
),
default=Value(None),
output_field=models.CharField(),
)
)
.values("display_name", "assignee_id", "avatar_url")
.annotate(
total_issues=Count(
"id",
@@ -820,7 +929,8 @@ class TransferCycleIssueEndpoint(BaseAPIView):
"assignee_id": (
str(item["assignee_id"]) if item["assignee_id"] else None
),
"avatar": item["avatar"],
"avatar": item.get("avatar"),
"avatar_url": item.get("avatar_url"),
"total_issues": item["total_issues"],
"completed_issues": item["completed_issues"],
"pending_issues": item["pending_issues"],
@@ -832,6 +942,7 @@ class TransferCycleIssueEndpoint(BaseAPIView):
label_distribution = (
Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
)
@@ -925,7 +1036,7 @@ class TransferCycleIssueEndpoint(BaseAPIView):
if (
new_cycle.end_date is not None
and new_cycle.end_date < timezone.now().date()
and new_cycle.end_date < timezone.now()
):
return Response(
{
@@ -1022,6 +1133,7 @@ class CycleProgressEndpoint(BaseAPIView):
Issue.issue_objects.filter(
estimate_point__estimate__type="points",
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
)
@@ -1074,6 +1186,7 @@ class CycleProgressEndpoint(BaseAPIView):
backlog_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
state__group="backlog",
@@ -1081,6 +1194,7 @@ class CycleProgressEndpoint(BaseAPIView):
unstarted_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
state__group="unstarted",
@@ -1088,6 +1202,7 @@ class CycleProgressEndpoint(BaseAPIView):
started_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
state__group="started",
@@ -1095,6 +1210,7 @@ class CycleProgressEndpoint(BaseAPIView):
cancelled_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
state__group="cancelled",
@@ -1102,6 +1218,7 @@ class CycleProgressEndpoint(BaseAPIView):
completed_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
state__group="completed",
@@ -1109,6 +1226,7 @@ class CycleProgressEndpoint(BaseAPIView):
total_issues = Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
).count()
@@ -1148,6 +1266,7 @@ class CycleProgressEndpoint(BaseAPIView):
status=status.HTTP_200_OK,
)
class CycleAnalyticsEndpoint(BaseAPIView):
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
@@ -1166,6 +1285,8 @@ class CycleAnalyticsEndpoint(BaseAPIView):
filter=Q(
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__issue__deleted_at__isnull=True,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -1193,13 +1314,33 @@ class CycleAnalyticsEndpoint(BaseAPIView):
assignee_distribution = (
Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
)
.annotate(display_name=F("assignees__display_name"))
.annotate(assignee_id=F("assignees__id"))
.annotate(avatar=F("assignees__avatar"))
.values("display_name", "assignee_id", "avatar")
.annotate(
avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
assignees__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"assignees__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
assignees__avatar_asset__isnull=True,
then="assignees__avatar",
),
default=Value(None),
output_field=models.CharField(),
)
)
.values("display_name", "assignee_id", "avatar_url")
.annotate(
total_estimates=Sum(
Cast("estimate_point__value", FloatField())
@@ -1231,6 +1372,7 @@ class CycleAnalyticsEndpoint(BaseAPIView):
label_distribution = (
Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
)
@@ -1277,13 +1419,33 @@ class CycleAnalyticsEndpoint(BaseAPIView):
assignee_distribution = (
Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
project_id=project_id,
workspace__slug=slug,
)
.annotate(display_name=F("assignees__display_name"))
.annotate(assignee_id=F("assignees__id"))
.annotate(avatar=F("assignees__avatar"))
.values("display_name", "assignee_id", "avatar")
.annotate(
avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
assignees__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"assignees__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
assignees__avatar_asset__isnull=True,
then="assignees__avatar",
),
default=Value(None),
output_field=models.CharField(),
)
)
.values("display_name", "assignee_id", "avatar_url")
.annotate(
total_issues=Count(
"assignee_id",
@@ -1316,6 +1478,7 @@ class CycleAnalyticsEndpoint(BaseAPIView):
label_distribution = (
Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
project_id=project_id,
workspace__slug=slug,
)
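The `issue_cycle__deleted_at__isnull=True` filter threaded through every aggregate in this diff excludes issues whose cycle membership was soft-deleted rather than hard-deleted. In-memory, the same selection looks like this (hypothetical sketch over plain dicts):

```python
def live_cycle_issue_ids(cycle_links, cycle_id):
    """Hypothetical sketch of the deleted_at__isnull=True filter:
    keep only links to cycle_id whose soft-delete timestamp is unset."""
    return [
        link["issue_id"]
        for link in cycle_links
        if link["cycle_id"] == cycle_id and link["deleted_at"] is None
    ]
```

Without this filter, soft-deleted `CycleIssue` rows would keep inflating the per-cycle counts and estimate sums.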

View File

@@ -3,7 +3,7 @@ import json
# Django imports
from django.core import serializers
from django.db.models import F, Func, OuterRef, Q
from django.db.models import F, Func, OuterRef, Q, Subquery
from django.utils import timezone
from django.utils.decorators import method_decorator
from django.views.decorators.gzip import gzip_page
@@ -12,6 +12,7 @@ from django.views.decorators.gzip import gzip_page
from rest_framework import status
from rest_framework.response import Response
# Module imports
from .. import BaseViewSet
from plane.app.serializers import (
@@ -22,7 +23,7 @@ from plane.db.models import (
Cycle,
CycleIssue,
Issue,
IssueAttachment,
FileAsset,
IssueLink,
)
from plane.utils.grouper import (
@@ -39,6 +40,7 @@ from plane.utils.paginator import (
from plane.app.permissions import allow_permission, ROLE
class CycleIssueViewSet(BaseViewSet):
serializer_class = CycleIssueSerializer
model = CycleIssue
@@ -90,7 +92,10 @@ class CycleIssueViewSet(BaseViewSet):
order_by_param = request.GET.get("order_by", "created_at")
filters = issue_filters(request.query_params, "GET")
issue_queryset = (
Issue.issue_objects.filter(issue_cycle__cycle_id=cycle_id)
Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
)
.filter(project_id=project_id)
.filter(workspace__slug=slug)
.filter(**filters)
@@ -102,7 +107,13 @@ class CycleIssueViewSet(BaseViewSet):
"issue_cycle__cycle",
)
.filter(**filters)
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(
cycle_id=Subquery(
CycleIssue.objects.filter(
issue=OuterRef("id"), deleted_at__isnull=True
).values("cycle_id")[:1]
)
)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@@ -110,8 +121,9 @@ class CycleIssueViewSet(BaseViewSet):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -184,10 +196,10 @@ class CycleIssueViewSet(BaseViewSet):
group_by_field_name=group_by,
sub_group_by_field_name=sub_group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
Q(issue_intake__status=1)
| Q(issue_intake__status=-1)
| Q(issue_intake__status=2)
| Q(issue_intake__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
@@ -213,10 +225,10 @@ class CycleIssueViewSet(BaseViewSet):
),
group_by_field_name=group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
Q(issue_intake__status=1)
| Q(issue_intake__status=-1)
| Q(issue_intake__status=2)
| Q(issue_intake__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
@@ -246,10 +258,7 @@ class CycleIssueViewSet(BaseViewSet):
workspace__slug=slug, project_id=project_id, pk=cycle_id
)
if (
cycle.end_date is not None
and cycle.end_date < timezone.now().date()
):
if cycle.end_date is not None and cycle.end_date < timezone.now():
return Response(
{
"error": "The Cycle has already been completed so no new issues can be added"

View File

@@ -36,14 +36,13 @@ from plane.db.models import (
DashboardWidget,
Issue,
IssueActivity,
IssueAttachment,
FileAsset,
IssueLink,
IssueRelation,
Project,
ProjectMember,
User,
Widget,
WorkspaceMember,
CycleIssue,
)
from plane.utils.issue_filters import issue_filters
@@ -58,7 +57,8 @@ def dashboard_overview_stats(self, request, slug):
project__project_projectmember__member=request.user,
workspace__slug=slug,
assignees__in=[request.user],
).filter(
)
.filter(
Q(
project__project_projectmember__role=5,
project__guest_view_all_features=True,
@@ -85,7 +85,8 @@ def dashboard_overview_stats(self, request, slug):
project__project_projectmember__member=request.user,
workspace__slug=slug,
assignees__in=[request.user],
).filter(
)
.filter(
Q(
project__project_projectmember__role=5,
project__guest_view_all_features=True,
@@ -110,7 +111,8 @@ def dashboard_overview_stats(self, request, slug):
project__project_projectmember__is_active=True,
project__project_projectmember__member=request.user,
created_by_id=request.user.id,
).filter(
)
.filter(
Q(
project__project_projectmember__role=5,
project__guest_view_all_features=True,
@@ -136,7 +138,8 @@ def dashboard_overview_stats(self, request, slug):
project__project_projectmember__member=request.user,
assignees__in=[request.user],
state__group="completed",
).filter(
)
.filter(
Q(
project__project_projectmember__role=5,
project__guest_view_all_features=True,
@@ -189,7 +192,13 @@ def dashboard_assigned_issues(self, request, slug):
).select_related("issue"),
)
)
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(
cycle_id=Subquery(
CycleIssue.objects.filter(
issue=OuterRef("id"), deleted_at__isnull=True
).values("cycle_id")[:1]
)
)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@@ -197,8 +206,9 @@ def dashboard_assigned_issues(self, request, slug):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -215,7 +225,10 @@ def dashboard_assigned_issues(self, request, slug):
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
filter=Q(
~Q(labels__id__isnull=True)
& Q(label_issue__deleted_at__isnull=True),
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -223,8 +236,11 @@ def dashboard_assigned_issues(self, request, slug):
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True),
filter=Q(
~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True)
& Q(issue_assignee__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -232,7 +248,11 @@ def dashboard_assigned_issues(self, request, slug):
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=~Q(issue_module__module_id__isnull=True),
filter=Q(
~Q(issue_module__module_id__isnull=True)
& Q(issue_module__module__archived_at__isnull=True)
& Q(issue_module__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -352,7 +372,13 @@ def dashboard_created_issues(self, request, slug):
.filter(**filters)
.select_related("workspace", "project", "state", "parent")
.prefetch_related("assignees", "labels", "issue_module__module")
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(
cycle_id=Subquery(
CycleIssue.objects.filter(
issue=OuterRef("id"), deleted_at__isnull=True
).values("cycle_id")[:1]
)
)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@@ -360,8 +386,9 @@ def dashboard_created_issues(self, request, slug):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -378,7 +405,10 @@ def dashboard_created_issues(self, request, slug):
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
filter=Q(
~Q(labels__id__isnull=True)
& Q(label_issue__deleted_at__isnull=True),
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -386,8 +416,11 @@ def dashboard_created_issues(self, request, slug):
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True),
filter=Q(
~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True)
& Q(issue_assignee__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -395,7 +428,11 @@ def dashboard_created_issues(self, request, slug):
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=~Q(issue_module__module_id__isnull=True),
filter=Q(
~Q(issue_module__module_id__isnull=True)
& Q(issue_module__module__archived_at__isnull=True)
& Q(issue_module__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
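The `ArrayAgg` changes in this file wrap each filter in `Q(...)` and add `deleted_at__isnull=True` checks on the through rows, so ids reached via soft-deleted label/assignee/module links drop out of the aggregated arrays. A plain-Python sketch of that aggregation (hypothetical pairs, not the ORM):

```python
def agg_live_ids(rows):
    """rows: (id_or_None, link_deleted_at) pairs, one per joined through row.

    Mirrors ArrayAgg(..., distinct=True,
                     filter=~Q(id__isnull=True) & Q(link__deleted_at__isnull=True)):
    keep distinct non-null ids whose link row is not soft-deleted.
    """
    seen, out = set(), []
    for id_, deleted_at in rows:
        if id_ is not None and deleted_at is None and id_ not in seen:
            seen.add(id_)
            out.append(id_)
    # Coalesce(..., Value([])) in the queryset guarantees [] instead of NULL
    return out

print(agg_live_ids([("a", None), ("a", None), ("b", "2024-11-01"), (None, None)]))
# -> ['a']
```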

View File

@@ -1,5 +1,9 @@
import random
import string
import json
# Django imports
from django.utils import timezone
# Third party imports
from rest_framework.response import Response
@@ -19,6 +23,7 @@ from plane.app.serializers import (
EstimateReadSerializer,
)
from plane.utils.cache import invalidate_cache
from plane.bgtasks.issue_activities_task import issue_activity
def generate_random_name(length=10):
@@ -249,11 +254,66 @@ class EstimatePointEndpoint(BaseViewSet):
)
# update all the issues with the new estimate
if new_estimate_id:
_ = Issue.objects.filter(
issues = Issue.objects.filter(
project_id=project_id,
workspace__slug=slug,
estimate_point_id=estimate_point_id,
).update(estimate_point_id=new_estimate_id)
)
for issue in issues:
issue_activity.delay(
type="issue.activity.updated",
requested_data=json.dumps(
{
"estimate_point": (
str(new_estimate_id)
if new_estimate_id
else None
),
}
),
actor_id=str(request.user.id),
issue_id=issue.id,
project_id=str(project_id),
current_instance=json.dumps(
{
"estimate_point": (
str(issue.estimate_point_id)
if issue.estimate_point_id
else None
),
}
),
epoch=int(timezone.now().timestamp()),
)
issues.update(estimate_point_id=new_estimate_id)
else:
issues = Issue.objects.filter(
project_id=project_id,
workspace__slug=slug,
estimate_point_id=estimate_point_id,
)
for issue in issues:
issue_activity.delay(
type="issue.activity.updated",
requested_data=json.dumps(
{
"estimate_point": None,
}
),
actor_id=str(request.user.id),
issue_id=issue.id,
project_id=str(project_id),
current_instance=json.dumps(
{
"estimate_point": (
str(issue.estimate_point_id)
if issue.estimate_point_id
else None
),
}
),
epoch=int(timezone.now().timestamp()),
)
# delete the estimate point
old_estimate_point = EstimatePoint.objects.filter(

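The estimate hunk above replaces a single blind `.update(estimate_point_id=...)` with a loop that enqueues one `issue_activity` task per issue, capturing each issue's old value, and only then performs the bulk update. A stand-alone sketch of why that ordering matters (plain dicts instead of the ORM and the Celery task):

```python
def reassign_estimate(issues, new_estimate_id):
    """issues: dicts with "id" and "estimate_point_id". Returns activity payloads."""
    activities = []
    for issue in issues:
        # The old value must be read before the bulk update overwrites it;
        # this is what the per-issue issue_activity.delay(...) calls record.
        activities.append({
            "issue_id": issue["id"],
            "requested": {"estimate_point": new_estimate_id},
            "current": {"estimate_point": issue["estimate_point_id"]},
        })
    for issue in issues:  # stands in for issues.update(estimate_point_id=...)
        issue["estimate_point_id"] = new_estimate_id
    return activities
```

Running `.update()` first would destroy the `current_instance` data the activity log needs.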
View File

@@ -3,7 +3,7 @@ import json
# Django import
from django.utils import timezone
from django.db.models import Q, Count, OuterRef, Func, F, Prefetch
from django.db.models import Q, Count, OuterRef, Func, F, Prefetch, Subquery
from django.core.serializers.json import DjangoJSONEncoder
from django.contrib.postgres.aggregates import ArrayAgg
from django.contrib.postgres.fields import ArrayField
@@ -18,30 +18,31 @@ from rest_framework.response import Response
from ..base import BaseViewSet
from plane.app.permissions import allow_permission, ROLE
from plane.db.models import (
Inbox,
InboxIssue,
Intake,
IntakeIssue,
Issue,
State,
IssueLink,
IssueAttachment,
FileAsset,
Project,
ProjectMember,
CycleIssue,
)
from plane.app.serializers import (
IssueCreateSerializer,
IssueSerializer,
InboxSerializer,
InboxIssueSerializer,
InboxIssueDetailSerializer,
IntakeSerializer,
IntakeIssueSerializer,
IntakeIssueDetailSerializer,
)
from plane.utils.issue_filters import issue_filters
from plane.bgtasks.issue_activities_task import issue_activity
class InboxViewSet(BaseViewSet):
class IntakeViewSet(BaseViewSet):
serializer_class = InboxSerializer
model = Inbox
serializer_class = IntakeSerializer
model = Intake
def get_queryset(self):
return (
@@ -53,8 +54,8 @@ class InboxViewSet(BaseViewSet):
)
.annotate(
pending_issue_count=Count(
"issue_inbox",
filter=Q(issue_inbox__status=-2),
"issue_intake",
filter=Q(issue_intake__status=-2),
)
)
.select_related("workspace", "project")
@@ -62,9 +63,9 @@ class InboxViewSet(BaseViewSet):
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def list(self, request, slug, project_id):
inbox = self.get_queryset().first()
intake = self.get_queryset().first()
return Response(
InboxSerializer(inbox).data,
IntakeSerializer(intake).data,
status=status.HTTP_200_OK,
)
@@ -74,26 +75,26 @@ class InboxViewSet(BaseViewSet):
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def destroy(self, request, slug, project_id, pk):
inbox = Inbox.objects.filter(
intake = Intake.objects.filter(
workspace__slug=slug, project_id=project_id, pk=pk
).first()
# Handle default inbox delete
if inbox.is_default:
# Handle default intake delete
if intake.is_default:
return Response(
{"error": "You cannot delete the default inbox"},
{"error": "You cannot delete the default intake"},
status=status.HTTP_400_BAD_REQUEST,
)
inbox.delete()
intake.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class InboxIssueViewSet(BaseViewSet):
class IntakeIssueViewSet(BaseViewSet):
serializer_class = InboxIssueSerializer
model = InboxIssue
serializer_class = IntakeIssueSerializer
model = IntakeIssue
filterset_fields = [
"status",
]
def get_queryset(self):
@@ -106,13 +107,19 @@ class InboxIssueViewSet(BaseViewSet):
.prefetch_related("assignees", "labels", "issue_module__module")
.prefetch_related(
Prefetch(
"issue_inbox",
queryset=InboxIssue.objects.only(
"issue_intake",
queryset=IntakeIssue.objects.only(
"status", "duplicate_to", "snoozed_till", "source"
),
)
)
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(
cycle_id=Subquery(
CycleIssue.objects.filter(
issue=OuterRef("id"), deleted_at__isnull=True
).values("cycle_id")[:1]
)
)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@@ -120,8 +127,9 @@ class InboxIssueViewSet(BaseViewSet):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -140,7 +148,10 @@ class InboxIssueViewSet(BaseViewSet):
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
filter=Q(
~Q(labels__id__isnull=True)
& Q(label_issue__deleted_at__isnull=True),
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -148,8 +159,11 @@ class InboxIssueViewSet(BaseViewSet):
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True),
filter=Q(
~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True)
& Q(issue_assignee__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -157,8 +171,11 @@ class InboxIssueViewSet(BaseViewSet):
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=~Q(issue_module__module_id__isnull=True)
& Q(issue_module__module__archived_at__isnull=True),
filter=Q(
~Q(issue_module__module_id__isnull=True)
& Q(issue_module__module__archived_at__isnull=True)
& Q(issue_module__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -167,14 +184,14 @@ class InboxIssueViewSet(BaseViewSet):
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def list(self, request, slug, project_id):
inbox_id = Inbox.objects.filter(
intake_id = Intake.objects.filter(
workspace__slug=slug, project_id=project_id
).first()
project = Project.objects.get(pk=project_id)
filters = issue_filters(request.GET, "GET", "issue__")
inbox_issue = (
InboxIssue.objects.filter(
inbox_id=inbox_id.id, project_id=project_id, **filters
intake_issue = (
IntakeIssue.objects.filter(
intake_id=intake_id.id, project_id=project_id, **filters
)
.select_related("issue")
.prefetch_related(
@@ -185,20 +202,23 @@ class InboxIssueViewSet(BaseViewSet):
ArrayAgg(
"issue__labels__id",
distinct=True,
filter=~Q(issue__labels__id__isnull=True),
filter=Q(
~Q(issue__labels__id__isnull=True)
& Q(issue__label_issue__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
)
)
).order_by(request.GET.get("order_by", "-issue__created_at"))
# inbox status filter
inbox_status = [
# Intake status filter
intake_status = [
item
for item in request.GET.get("status", "-2").split(",")
if item != "null"
]
if inbox_status:
inbox_issue = inbox_issue.filter(status__in=inbox_status)
if intake_status:
intake_issue = intake_issue.filter(status__in=intake_status)
if (
ProjectMember.objects.filter(
@@ -210,12 +230,12 @@ class InboxIssueViewSet(BaseViewSet):
).exists()
and not project.guest_view_all_features
):
inbox_issue = inbox_issue.filter(created_by=request.user)
intake_issue = intake_issue.filter(created_by=request.user)
return self.paginate(
request=request,
queryset=(inbox_issue),
on_results=lambda inbox_issues: InboxIssueSerializer(
inbox_issues,
queryset=(intake_issue),
on_results=lambda intake_issues: IntakeIssueSerializer(
intake_issues,
many=True,
).data,
)
@@ -241,16 +261,6 @@ class InboxIssueViewSet(BaseViewSet):
status=status.HTTP_400_BAD_REQUEST,
)
# Create or get state
state, _ = State.objects.get_or_create(
name="Triage",
group="triage",
description="Default state for managing all Inbox Issues",
project_id=project_id,
color="#ff7700",
is_triage=True,
)
# create an issue
project = Project.objects.get(pk=project_id)
serializer = IssueCreateSerializer(
@@ -263,15 +273,15 @@ class InboxIssueViewSet(BaseViewSet):
)
if serializer.is_valid():
serializer.save()
inbox_id = Inbox.objects.filter(
intake_id = Intake.objects.filter(
workspace__slug=slug, project_id=project_id
).first()
# create an inbox issue
inbox_issue = InboxIssue.objects.create(
inbox_id=inbox_id.id,
# create an intake issue
intake_issue = IntakeIssue.objects.create(
intake_id=intake_id.id,
project_id=project_id,
issue_id=serializer.data["id"],
source=request.data.get("source", "in-app"),
source=request.data.get("source", "IN-APP"),
)
# Create an Issue Activity
issue_activity.delay(
@@ -284,10 +294,10 @@ class InboxIssueViewSet(BaseViewSet):
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
inbox=str(inbox_issue.id),
intake=str(intake_issue.id),
)
inbox_issue = (
InboxIssue.objects.select_related("issue")
intake_issue = (
IntakeIssue.objects.select_related("issue")
.prefetch_related(
"issue__labels",
"issue__assignees",
@@ -297,7 +307,12 @@ class InboxIssueViewSet(BaseViewSet):
ArrayAgg(
"issue__labels__id",
distinct=True,
filter=~Q(issue__labels__id__isnull=True),
filter=Q(
~Q(issue__labels__id__isnull=True)
& Q(
issue__label_issue__deleted_at__isnull=True
)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -305,34 +320,37 @@ class InboxIssueViewSet(BaseViewSet):
ArrayAgg(
"issue__assignees__id",
distinct=True,
filter=~Q(issue__assignees__id__isnull=True),
filter=~Q(issue__assignees__id__isnull=True)
& Q(
issue__assignees__member_project__is_active=True
),
),
Value([], output_field=ArrayField(UUIDField())),
),
)
.get(
inbox_id=inbox_id.id,
intake_id=intake_id.id,
issue_id=serializer.data["id"],
project_id=project_id,
)
)
serializer = InboxIssueDetailSerializer(inbox_issue)
serializer = IntakeIssueDetailSerializer(intake_issue)
return Response(serializer.data, status=status.HTTP_200_OK)
else:
return Response(
serializer.errors, status=status.HTTP_400_BAD_REQUEST
)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
@allow_permission(allowed_roles=[ROLE.ADMIN], creator=True, model=Issue)
def partial_update(self, request, slug, project_id, pk):
inbox_id = Inbox.objects.filter(
intake_id = Intake.objects.filter(
workspace__slug=slug, project_id=project_id
).first()
inbox_issue = InboxIssue.objects.get(
intake_issue = IntakeIssue.objects.get(
issue_id=pk,
workspace__slug=slug,
project_id=project_id,
inbox_id=inbox_id,
intake_id=intake_id,
)
# Get the project member
project_member = ProjectMember.objects.get(
@@ -342,11 +360,11 @@ class InboxIssueViewSet(BaseViewSet):
is_active=True,
)
# Only project admins, members, and created_by users can access this endpoint
if project_member.role <= 5 and str(inbox_issue.created_by_id) != str(
if project_member.role <= 5 and str(intake_issue.created_by_id) != str(
request.user.id
):
return Response(
{"error": "You cannot edit inbox issues"},
{"error": "You cannot edit intake issues"},
status=status.HTTP_400_BAD_REQUEST,
)
@@ -358,7 +376,10 @@ class InboxIssueViewSet(BaseViewSet):
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
filter=Q(
~Q(labels__id__isnull=True)
& Q(label_issue__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -366,12 +387,15 @@ class InboxIssueViewSet(BaseViewSet):
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True),
filter=Q(
~Q(assignees__id__isnull=True)
& Q(issue_assignee__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
).get(
pk=inbox_issue.issue_id,
pk=intake_issue.issue_id,
workspace__slug=slug,
project_id=project_id,
)
@@ -409,7 +433,7 @@ class InboxIssueViewSet(BaseViewSet):
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
inbox=str(inbox_issue.id),
intake=str(intake_issue.id),
)
issue_serializer.save()
else:
@@ -417,20 +441,20 @@ class InboxIssueViewSet(BaseViewSet):
issue_serializer.errors, status=status.HTTP_400_BAD_REQUEST
)
# Only project admins and members can edit inbox issue attributes
if project_member.role > 5:
serializer = InboxIssueSerializer(
inbox_issue, data=request.data, partial=True
# Only project admins and members can edit intake issue attributes
if project_member.role > 15:
serializer = IntakeIssueSerializer(
intake_issue, data=request.data, partial=True
)
current_instance = json.dumps(
InboxIssueSerializer(inbox_issue).data, cls=DjangoJSONEncoder
IntakeIssueSerializer(intake_issue).data, cls=DjangoJSONEncoder
)
if serializer.is_valid():
serializer.save()
# Update the issue state if the issue is rejected or marked as duplicate
if serializer.data["status"] in [-1, 2]:
issue = Issue.objects.get(
pk=inbox_issue.issue_id,
pk=intake_issue.issue_id,
workspace__slug=slug,
project_id=project_id,
)
@@ -446,7 +470,7 @@ class InboxIssueViewSet(BaseViewSet):
# Update the issue state if it is accepted
if serializer.data["status"] in [1]:
issue = Issue.objects.get(
pk=inbox_issue.issue_id,
pk=intake_issue.issue_id,
workspace__slug=slug,
project_id=project_id,
)
@@ -464,7 +488,7 @@ class InboxIssueViewSet(BaseViewSet):
issue.save()
# create an activity for the status change
issue_activity.delay(
type="inbox.activity.created",
type="intake.activity.created",
requested_data=json.dumps(
request.data, cls=DjangoJSONEncoder
),
@@ -475,11 +499,11 @@ class InboxIssueViewSet(BaseViewSet):
epoch=int(timezone.now().timestamp()),
notification=False,
origin=request.META.get("HTTP_ORIGIN"),
inbox=(inbox_issue.id),
intake=(intake_issue.id),
)
inbox_issue = (
InboxIssue.objects.select_related("issue")
intake_issue = (
IntakeIssue.objects.select_related("issue")
.prefetch_related(
"issue__labels",
"issue__assignees",
@@ -489,7 +513,12 @@ class InboxIssueViewSet(BaseViewSet):
ArrayAgg(
"issue__labels__id",
distinct=True,
filter=~Q(issue__labels__id__isnull=True),
filter=Q(
~Q(issue__labels__id__isnull=True)
& Q(
issue__label_issue__deleted_at__isnull=True
)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -497,24 +526,29 @@ class InboxIssueViewSet(BaseViewSet):
ArrayAgg(
"issue__assignees__id",
distinct=True,
filter=~Q(issue__assignees__id__isnull=True),
filter=Q(
~Q(issue__assignees__id__isnull=True)
& Q(
issue__issue_assignee__deleted_at__isnull=True
)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
)
.get(
inbox_id=inbox_id.id,
intake_id=intake_id.id,
issue_id=pk,
project_id=project_id,
)
)
serializer = InboxIssueDetailSerializer(inbox_issue).data
serializer = IntakeIssueDetailSerializer(intake_issue).data
return Response(serializer, status=status.HTTP_200_OK)
return Response(
serializer.errors, status=status.HTTP_400_BAD_REQUEST
)
else:
serializer = InboxIssueDetailSerializer(inbox_issue).data
serializer = IntakeIssueDetailSerializer(intake_issue).data
return Response(serializer, status=status.HTTP_200_OK)
@allow_permission(
@@ -527,12 +561,12 @@ class InboxIssueViewSet(BaseViewSet):
model=Issue,
)
def retrieve(self, request, slug, project_id, pk):
inbox_id = Inbox.objects.filter(
intake_id = Intake.objects.filter(
workspace__slug=slug, project_id=project_id
).first()
project = Project.objects.get(pk=project_id)
inbox_issue = (
InboxIssue.objects.select_related("issue")
intake_issue = (
IntakeIssue.objects.select_related("issue")
.prefetch_related(
"issue__labels",
"issue__assignees",
@@ -542,7 +576,10 @@ class InboxIssueViewSet(BaseViewSet):
ArrayAgg(
"issue__labels__id",
distinct=True,
filter=~Q(issue__labels__id__isnull=True),
filter=Q(
~Q(issue__labels__id__isnull=True)
& Q(issue__label_issue__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -550,12 +587,15 @@ class InboxIssueViewSet(BaseViewSet):
ArrayAgg(
"issue__assignees__id",
distinct=True,
filter=~Q(issue__assignees__id__isnull=True),
filter=Q(
~Q(issue__assignees__id__isnull=True)
& Q(issue__issue_assignee__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
)
.get(inbox_id=inbox_id.id, issue_id=pk, project_id=project_id)
.get(intake_id=intake_id.id, issue_id=pk, project_id=project_id)
)
if (
ProjectMember.objects.filter(
@@ -566,13 +606,13 @@ class InboxIssueViewSet(BaseViewSet):
is_active=True,
).exists()
and not project.guest_view_all_features
and not inbox_issue.created_by == request.user
and not intake_issue.created_by == request.user
):
return Response(
{"error": "You are not allowed to view this issue"},
status=status.HTTP_400_BAD_REQUEST,
)
issue = InboxIssueDetailSerializer(inbox_issue).data
issue = IntakeIssueDetailSerializer(intake_issue).data
return Response(
issue,
status=status.HTTP_200_OK,
@@ -580,23 +620,23 @@ class InboxIssueViewSet(BaseViewSet):
@allow_permission(allowed_roles=[ROLE.ADMIN], creator=True, model=Issue)
def destroy(self, request, slug, project_id, pk):
inbox_id = Inbox.objects.filter(
intake_id = Intake.objects.filter(
workspace__slug=slug, project_id=project_id
).first()
inbox_issue = InboxIssue.objects.get(
intake_issue = IntakeIssue.objects.get(
issue_id=pk,
workspace__slug=slug,
project_id=project_id,
inbox_id=inbox_id,
intake_id=intake_id,
)
# Check the issue status
if inbox_issue.status in [-2, -1, 0, 2]:
if intake_issue.status in [-2, -1, 0, 2]:
# Delete the issue also
issue = Issue.objects.filter(
workspace__slug=slug, project_id=project_id, pk=pk
).first()
issue.delete()
inbox_issue.delete()
intake_issue.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
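The status integers recurring through this file follow the diff's own comments: -2 is pending (the default list filter), -1 rejected, 1 accepted, 2 duplicate; reading 0 as "snoozed" is an assumption inferred from the `snoozed_till` field. The `destroy` guard above can then be restated as:

```python
# Labels for -2, -1, 1, 2 come from the diff's own comments; 0 = "snoozed"
# is an assumption inferred from the snoozed_till field.
INTAKE_STATUS = {-2: "pending", -1: "rejected", 0: "snoozed", 1: "accepted", 2: "duplicate"}

def deletes_underlying_issue(status):
    # destroy() also removes the Issue unless the intake issue was accepted.
    return status in [-2, -1, 0, 2]

print(deletes_underlying_issue(1))  # False: accepted issues are kept
```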

View File

@@ -3,14 +3,7 @@ import json
# Django imports
from django.core.serializers.json import DjangoJSONEncoder
from django.db.models import (
F,
Func,
OuterRef,
Q,
Prefetch,
Exists,
)
from django.db.models import F, Func, OuterRef, Q, Prefetch, Exists, Subquery
from django.utils import timezone
from django.utils.decorators import method_decorator
from django.views.decorators.gzip import gzip_page
@@ -30,10 +23,11 @@ from plane.app.serializers import (
from plane.bgtasks.issue_activities_task import issue_activity
from plane.db.models import (
Issue,
IssueAttachment,
FileAsset,
IssueLink,
IssueSubscriber,
IssueReaction,
CycleIssue
)
from plane.utils.grouper import (
issue_group_values,
@@ -71,7 +65,13 @@ class IssueArchiveViewSet(BaseViewSet):
.filter(workspace__slug=self.kwargs.get("slug"))
.select_related("workspace", "project", "state", "parent")
.prefetch_related("assignees", "labels", "issue_module__module")
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(
cycle_id=Subquery(
CycleIssue.objects.filter(
issue=OuterRef("id"), deleted_at__isnull=True
).values("cycle_id")[:1]
)
)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@@ -79,8 +79,9 @@ class IssueArchiveViewSet(BaseViewSet):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -170,10 +171,10 @@ class IssueArchiveViewSet(BaseViewSet):
group_by_field_name=group_by,
sub_group_by_field_name=sub_group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
Q(issue_intake__status=1)
| Q(issue_intake__status=-1)
| Q(issue_intake__status=2)
| Q(issue_intake__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
@@ -199,10 +200,10 @@ class IssueArchiveViewSet(BaseViewSet):
),
group_by_field_name=group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
Q(issue_intake__status=1)
| Q(issue_intake__status=-1)
| Q(issue_intake__status=2)
| Q(issue_intake__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
@@ -236,12 +237,6 @@ class IssueArchiveViewSet(BaseViewSet):
),
)
)
.prefetch_related(
Prefetch(
"issue_attachment",
queryset=IssueAttachment.objects.select_related("issue"),
)
)
.prefetch_related(
Prefetch(
"issue_link",

View File

@@ -1,9 +1,12 @@
# Python imports
import json
import uuid
# Django imports
from django.utils import timezone
from django.core.serializers.json import DjangoJSONEncoder
from django.conf import settings
from django.http import HttpResponseRedirect
# Third Party imports
from rest_framework.response import Response
@@ -13,21 +16,29 @@ from rest_framework.parsers import MultiPartParser, FormParser
# Module imports
from .. import BaseAPIView
from plane.app.serializers import IssueAttachmentSerializer
from plane.db.models import IssueAttachment
from plane.db.models import FileAsset, Workspace
from plane.bgtasks.issue_activities_task import issue_activity
from plane.app.permissions import allow_permission, ROLE
from plane.settings.storage import S3Storage
from plane.bgtasks.storage_metadata_task import get_asset_object_metadata
class IssueAttachmentEndpoint(BaseAPIView):
serializer_class = IssueAttachmentSerializer
model = IssueAttachment
model = FileAsset
parser_classes = (MultiPartParser, FormParser)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def post(self, request, slug, project_id, issue_id):
serializer = IssueAttachmentSerializer(data=request.data)
workspace = Workspace.objects.get(slug=slug)
if serializer.is_valid():
serializer.save(project_id=project_id, issue_id=issue_id)
serializer.save(
project_id=project_id,
issue_id=issue_id,
workspace_id=workspace.id,
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
issue_activity.delay(
type="attachment.activity.created",
requested_data=None,
@@ -45,9 +56,9 @@ class IssueAttachmentEndpoint(BaseAPIView):
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
@allow_permission([ROLE.ADMIN], creator=True, model=IssueAttachment)
@allow_permission([ROLE.ADMIN], creator=True, model=FileAsset)
def delete(self, request, slug, project_id, issue_id, pk):
issue_attachment = IssueAttachment.objects.get(pk=pk)
issue_attachment = FileAsset.objects.get(pk=pk)
issue_attachment.asset.delete(save=False)
issue_attachment.delete()
issue_activity.delay(
@@ -72,8 +83,180 @@ class IssueAttachmentEndpoint(BaseAPIView):
]
)
def get(self, request, slug, project_id, issue_id):
issue_attachments = IssueAttachment.objects.filter(
issue_attachments = FileAsset.objects.filter(
issue_id=issue_id, workspace__slug=slug, project_id=project_id
)
serializer = IssueAttachmentSerializer(issue_attachments, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
class IssueAttachmentV2Endpoint(BaseAPIView):
serializer_class = IssueAttachmentSerializer
model = FileAsset
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def post(self, request, slug, project_id, issue_id):
name = request.data.get("name")
type = request.data.get("type", False)
size = int(request.data.get("size", settings.FILE_SIZE_LIMIT))
if not type or type not in settings.ATTACHMENT_MIME_TYPES:
return Response(
{
"error": "Invalid file type.",
"status": False,
},
status=status.HTTP_400_BAD_REQUEST,
)
# Get the workspace
workspace = Workspace.objects.get(slug=slug)
# asset key
asset_key = f"{workspace.id}/{uuid.uuid4().hex}-{name}"
# Get the size limit
size_limit = min(size, settings.FILE_SIZE_LIMIT)
# Create a File Asset
asset = FileAsset.objects.create(
attributes={
"name": name,
"type": type,
"size": size_limit,
},
asset=asset_key,
size=size_limit,
workspace_id=workspace.id,
created_by=request.user,
issue_id=issue_id,
project_id=project_id,
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
# Get the presigned URL
storage = S3Storage(request=request)
# Generate a presigned URL to share an S3 object
presigned_url = storage.generate_presigned_post(
object_name=asset_key,
file_type=type,
file_size=size_limit,
)
# Return the presigned URL
return Response(
{
"upload_data": presigned_url,
"asset_id": str(asset.id),
"attachment": IssueAttachmentSerializer(asset).data,
"asset_url": asset.asset_url,
},
status=status.HTTP_200_OK,
)
@allow_permission([ROLE.ADMIN], creator=True, model=FileAsset)
def delete(self, request, slug, project_id, issue_id, pk):
issue_attachment = FileAsset.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id
)
issue_attachment.is_deleted = True
issue_attachment.deleted_at = timezone.now()
issue_attachment.save()
issue_activity.delay(
type="attachment.activity.deleted",
requested_data=None,
actor_id=str(self.request.user.id),
issue_id=str(issue_id),
project_id=str(project_id),
current_instance=None,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
)
return Response(status=status.HTTP_204_NO_CONTENT)
@allow_permission(
[
ROLE.ADMIN,
ROLE.MEMBER,
ROLE.GUEST,
]
)
def get(self, request, slug, project_id, issue_id, pk=None):
if pk:
# Get the asset
asset = FileAsset.objects.get(
id=pk, workspace__slug=slug, project_id=project_id
)
# Check if the asset is uploaded
if not asset.is_uploaded:
return Response(
{
"error": "The asset is not uploaded.",
"status": False,
},
status=status.HTTP_400_BAD_REQUEST,
)
storage = S3Storage(request=request)
presigned_url = storage.generate_presigned_url(
object_name=asset.asset.name,
disposition="attachment",
filename=asset.attributes.get("name"),
)
return HttpResponseRedirect(presigned_url)
# Get all the attachments
issue_attachments = FileAsset.objects.filter(
issue_id=issue_id,
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
workspace__slug=slug,
project_id=project_id,
is_uploaded=True,
)
# Serialize the attachments
serializer = IssueAttachmentSerializer(issue_attachments, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
@allow_permission(
[
ROLE.ADMIN,
ROLE.MEMBER,
ROLE.GUEST,
]
)
def patch(self, request, slug, project_id, issue_id, pk):
issue_attachment = FileAsset.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id
)
serializer = IssueAttachmentSerializer(issue_attachment)
# Send this activity only if the attachment has not been uploaded before
if not issue_attachment.is_uploaded:
issue_activity.delay(
type="attachment.activity.created",
requested_data=None,
actor_id=str(self.request.user.id),
issue_id=str(self.kwargs.get("issue_id", None)),
project_id=str(self.kwargs.get("project_id", None)),
current_instance=json.dumps(
serializer.data,
cls=DjangoJSONEncoder,
),
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
)
# Update the attachment
issue_attachment.is_uploaded = True
issue_attachment.created_by = request.user
# Get the storage metadata
if not issue_attachment.storage_metadata:
get_asset_object_metadata.delay(str(issue_attachment.id))
issue_attachment.save()
return Response(status=status.HTTP_204_NO_CONTENT)
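The upload flow introduced by `IssueAttachmentV2Endpoint` first validates the declared MIME type and clamps the declared size to the configured limit before creating the `FileAsset` and issuing a presigned URL. A minimal standalone sketch of that validation step, with hypothetical `ALLOWED_MIME_TYPES` and `FILE_SIZE_LIMIT` constants standing in for the Django `settings` values:

```python
# Hypothetical stand-ins for settings.ATTACHMENT_MIME_TYPES and
# settings.FILE_SIZE_LIMIT; the real values live in Django settings.
ALLOWED_MIME_TYPES = {"image/png", "image/jpeg", "application/pdf"}
FILE_SIZE_LIMIT = 5 * 1024 * 1024  # 5 MiB


def validate_attachment(name, mime_type, size):
    """Mirror the endpoint's checks: reject unknown MIME types and
    clamp the client-declared size to the configured limit, returning
    the `attributes` dict stored on the FileAsset."""
    if not mime_type or mime_type not in ALLOWED_MIME_TYPES:
        raise ValueError("Invalid file type.")
    return {
        "name": name,
        "type": mime_type,
        "size": min(int(size), FILE_SIZE_LIMIT),
    }
```

The clamp matters because the size is client-supplied: the presigned POST is generated with the clamped value, so a client cannot reserve an upload larger than the server-side limit.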


@@ -14,6 +14,9 @@ from django.db.models import (
Q,
UUIDField,
Value,
Subquery,
Case,
When,
)
from django.db.models.functions import Coalesce
from django.utils import timezone
@@ -35,13 +38,14 @@ from plane.app.serializers import (
from plane.bgtasks.issue_activities_task import issue_activity
from plane.db.models import (
Issue,
IssueAttachment,
FileAsset,
IssueLink,
IssueUserProperty,
IssueReaction,
IssueSubscriber,
Project,
ProjectMember,
CycleIssue,
)
from plane.utils.grouper import (
issue_group_values,
@@ -83,7 +87,13 @@ class IssueListEndpoint(BaseAPIView):
.filter(workspace__slug=self.kwargs.get("slug"))
.select_related("workspace", "project", "state", "parent")
.prefetch_related("assignees", "labels", "issue_module__module")
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(
cycle_id=Subquery(
CycleIssue.objects.filter(
issue=OuterRef("id"), deleted_at__isnull=True
).values("cycle_id")[:1]
)
)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@@ -91,8 +101,9 @@ class IssueListEndpoint(BaseAPIView):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -206,7 +217,13 @@ class IssueViewSet(BaseViewSet):
.filter(workspace__slug=self.kwargs.get("slug"))
.select_related("workspace", "project", "state", "parent")
.prefetch_related("assignees", "labels", "issue_module__module")
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(
cycle_id=Subquery(
CycleIssue.objects.filter(
issue=OuterRef("id"), deleted_at__isnull=True
).values("cycle_id")[:1]
)
)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@@ -214,8 +231,9 @@ class IssueViewSet(BaseViewSet):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -318,10 +336,10 @@ class IssueViewSet(BaseViewSet):
group_by_field_name=group_by,
sub_group_by_field_name=sub_group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
Q(issue_intake__status=1)
| Q(issue_intake__status=-1)
| Q(issue_intake__status=2)
| Q(issue_intake__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
@@ -346,10 +364,10 @@ class IssueViewSet(BaseViewSet):
),
group_by_field_name=group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
Q(issue_intake__status=1)
| Q(issue_intake__status=-1)
| Q(issue_intake__status=2)
| Q(issue_intake__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
@@ -462,14 +480,54 @@ class IssueViewSet(BaseViewSet):
project = Project.objects.get(pk=project_id, workspace__slug=slug)
issue = (
self.get_queryset()
Issue.objects.filter(
project_id=self.kwargs.get("project_id")
)
.filter(workspace__slug=self.kwargs.get("slug"))
.select_related("workspace", "project", "state", "parent")
.prefetch_related("assignees", "labels", "issue_module__module")
.annotate(
cycle_id=Case(
When(
issue_cycle__cycle__deleted_at__isnull=True,
then=F("issue_cycle__cycle_id"),
),
default=None,
)
)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.filter(pk=pk)
.annotate(
label_ids=Coalesce(
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
filter=Q(
~Q(labels__id__isnull=True)
& Q(label_issue__deleted_at__isnull=True),
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -477,8 +535,11 @@ class IssueViewSet(BaseViewSet):
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True),
filter=Q(
~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True)
& Q(issue_assignee__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -486,8 +547,11 @@ class IssueViewSet(BaseViewSet):
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=~Q(issue_module__module_id__isnull=True)
& Q(issue_module__module__archived_at__isnull=True),
filter=Q(
~Q(issue_module__module_id__isnull=True)
& Q(issue_module__module__archived_at__isnull=True)
& Q(issue_module__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -500,12 +564,6 @@ class IssueViewSet(BaseViewSet):
),
)
)
.prefetch_related(
Prefetch(
"issue_attachment",
queryset=IssueAttachment.objects.select_related("issue"),
)
)
.prefetch_related(
Prefetch(
"issue_link",
@@ -572,7 +630,10 @@ class IssueViewSet(BaseViewSet):
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
filter=Q(
~Q(labels__id__isnull=True)
& Q(label_issue__deleted_at__isnull=True),
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -580,8 +641,11 @@ class IssueViewSet(BaseViewSet):
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True),
filter=Q(
~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True)
& Q(issue_assignee__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -589,7 +653,11 @@ class IssueViewSet(BaseViewSet):
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=~Q(issue_module__module_id__isnull=True),
filter=Q(
~Q(issue_module__module_id__isnull=True)
& Q(issue_module__module__archived_at__isnull=True)
& Q(issue_module__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -741,23 +809,24 @@ class DeletedIssuesListViewSet(BaseAPIView):
class IssuePaginatedViewSet(BaseViewSet):
def get_queryset(self):
workspace_slug = self.kwargs.get("slug")
# getting the project_id from the request params
project_id = self.request.GET.get("project_id", None)
project_id = self.kwargs.get("project_id")
issue_queryset = Issue.issue_objects.filter(
workspace__slug=workspace_slug
workspace__slug=workspace_slug, project_id=project_id
)
if project_id:
issue_queryset = issue_queryset.filter(project_id=project_id)
return (
issue_queryset.select_related(
"workspace", "project", "state", "parent"
)
.prefetch_related("assignees", "labels", "issue_module__module")
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(
cycle_id=Subquery(
CycleIssue.objects.filter(
issue=OuterRef("id"), deleted_at__isnull=True
).values("cycle_id")[:1]
)
)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@@ -765,8 +834,9 @@ class IssuePaginatedViewSet(BaseViewSet):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -793,10 +863,10 @@ class IssuePaginatedViewSet(BaseViewSet):
return paginated_data
def list(self, request, slug):
project_id = self.request.GET.get("project_id", None)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def list(self, request, slug, project_id):
cursor = request.GET.get("cursor", None)
is_description_required = request.GET.get("description", False)
is_description_required = request.GET.get("description", "false")
updated_at = request.GET.get("updated_at__gt", None)
# required fields
@@ -829,18 +899,30 @@ class IssuePaginatedViewSet(BaseViewSet):
"sub_issues_count",
]
if is_description_required:
if str(is_description_required).lower() == "true":
required_fields.append("description_html")
# querying issues
base_queryset = Issue.issue_objects.filter(workspace__slug=slug)
if project_id:
base_queryset = base_queryset.filter(project_id=project_id)
base_queryset = Issue.issue_objects.filter(
workspace__slug=slug, project_id=project_id
)
base_queryset = base_queryset.order_by("updated_at")
queryset = self.get_queryset().order_by("updated_at")
# validation for guest user
project = Project.objects.get(pk=project_id, workspace__slug=slug)
project_member = ProjectMember.objects.filter(
workspace__slug=slug,
project_id=project_id,
member=request.user,
role=5,
is_active=True,
)
if project_member.exists() and not project.guest_view_all_features:
base_queryset = base_queryset.filter(created_by=request.user)
queryset = queryset.filter(created_by=request.user)
# filter issues updated after the updated_at timestamp given by the user
if updated_at:
base_queryset = base_queryset.filter(updated_at__gt=updated_at)
@@ -851,7 +933,10 @@ class IssuePaginatedViewSet(BaseViewSet):
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
filter=Q(
~Q(labels__id__isnull=True)
& Q(label_issue__deleted_at__isnull=True),
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -859,8 +944,11 @@ class IssuePaginatedViewSet(BaseViewSet):
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True),
filter=Q(
~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True)
& Q(issue_assignee__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -868,8 +956,11 @@ class IssuePaginatedViewSet(BaseViewSet):
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=~Q(issue_module__module_id__isnull=True)
& Q(issue_module__module__archived_at__isnull=True),
filter=Q(
~Q(issue_module__module_id__isnull=True)
& Q(issue_module__module__archived_at__isnull=True)
& Q(issue_module__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -885,3 +976,194 @@ class IssuePaginatedViewSet(BaseViewSet):
)
return Response(paginated_data, status=status.HTTP_200_OK)
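The paginated list above adds a guest-visibility check: a project member with the guest role (`role=5` in the diff) in a project that has not enabled `guest_view_all_features` only sees issues they created. The decision can be sketched as a plain predicate (the numeric role value is taken from the diff and assumed to correspond to `ROLE.GUEST`):

```python
GUEST_ROLE = 5  # assumption: the numeric value of ROLE.GUEST used in the diff


def restrict_to_own_issues(member_role, guest_view_all_features):
    """Return True when the issue listing must be narrowed to
    created_by=request.user: the requester is a guest member and the
    project has not opted into guest_view_all_features."""
    return member_role == GUEST_ROLE and not guest_view_all_features
```

When the predicate holds, both `base_queryset` and the annotated `queryset` are filtered with `created_by=request.user`, so counts and results stay consistent.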
class IssueDetailEndpoint(BaseAPIView):
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def get(self, request, slug, project_id):
filters = issue_filters(request.query_params, "GET")
issue = (
Issue.issue_objects.filter(
workspace__slug=slug, project_id=project_id
)
.select_related("workspace", "project", "state", "parent")
.prefetch_related("assignees", "labels", "issue_module__module")
.annotate(
cycle_id=Subquery(
CycleIssue.objects.filter(
issue=OuterRef("id"), deleted_at__isnull=True
).values("cycle_id")[:1]
)
)
.annotate(
label_ids=Coalesce(
ArrayAgg(
"labels__id",
distinct=True,
filter=Q(
~Q(labels__id__isnull=True)
& Q(label_issue__deleted_at__isnull=True),
),
),
Value([], output_field=ArrayField(UUIDField())),
),
assignee_ids=Coalesce(
ArrayAgg(
"assignees__id",
distinct=True,
filter=Q(
~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True)
& Q(issue_assignee__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
module_ids=Coalesce(
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=Q(
~Q(issue_module__module_id__isnull=True)
& Q(issue_module__module__archived_at__isnull=True)
& Q(issue_module__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
)
issue = issue.filter(**filters)
order_by_param = request.GET.get("order_by", "-created_at")
# Issue queryset
issue, order_by_param = order_issue_queryset(
issue_queryset=issue,
order_by_param=order_by_param,
)
return self.paginate(
request=request,
order_by=order_by_param,
queryset=(issue),
on_results=lambda issue: IssueSerializer(
issue,
many=True,
fields=self.fields,
expand=self.expand,
).data,
)
class IssueBulkUpdateDateEndpoint(BaseAPIView):
def validate_dates(
self, current_start, current_target, new_start, new_target
):
"""
Validate that start date is before target date.
"""
start = new_start or current_start
target = new_target or current_target
if start and target and start > target:
return False
return True
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def post(self, request, slug, project_id):
updates = request.data.get("updates", [])
issue_ids = [update["id"] for update in updates]
epoch = int(timezone.now().timestamp())
# Fetch all relevant issues in a single query
issues = list(Issue.objects.filter(id__in=issue_ids))
issues_dict = {str(issue.id): issue for issue in issues}
issues_to_update = []
for update in updates:
issue_id = update["id"]
issue = issues_dict.get(issue_id)
if not issue:
continue
start_date = update.get("start_date")
target_date = update.get("target_date")
validate_dates = self.validate_dates(
issue.start_date, issue.target_date, start_date, target_date
)
if not validate_dates:
return Response(
{
"message": "Start date cannot exceed target date",
},
status=status.HTTP_400_BAD_REQUEST,
)
if start_date:
issue_activity.delay(
type="issue.activity.updated",
requested_data=json.dumps(
{"start_date": update.get("start_date")}
),
current_instance=json.dumps(
{"start_date": str(issue.start_date)}
),
issue_id=str(issue_id),
actor_id=str(request.user.id),
project_id=str(project_id),
epoch=epoch,
)
issue.start_date = start_date
issues_to_update.append(issue)
if target_date:
issue_activity.delay(
type="issue.activity.updated",
requested_data=json.dumps(
{"target_date": update.get("target_date")}
),
current_instance=json.dumps(
{"target_date": str(issue.target_date)}
),
issue_id=str(issue_id),
actor_id=str(request.user.id),
project_id=str(project_id),
epoch=epoch,
)
issue.target_date = target_date
issues_to_update.append(issue)
# Bulk update issues
Issue.objects.bulk_update(
issues_to_update, ["start_date", "target_date"]
)
return Response(
{"message": "Issues updated successfully"},
status=status.HTTP_200_OK,
)
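The date rule enforced by `IssueBulkUpdateDateEndpoint.validate_dates` merges each incoming date with the stored one and rejects the update only when both effective dates exist and start falls after target. The same logic as a standalone function, runnable with `datetime.date` values:

```python
from datetime import date


def validate_dates(current_start, current_target, new_start, new_target):
    """Effective start date may not fall after the effective target date;
    a missing date on either side never fails validation."""
    start = new_start or current_start
    target = new_target or current_target
    if start and target and start > target:
        return False
    return True
```

Note the `or` fallback: an update that supplies only one date is still validated against the issue's existing counterpart, which is why the endpoint fetches all affected issues before looping.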


@@ -1,410 +0,0 @@
# Python imports
import json
# Django imports
from django.contrib.postgres.aggregates import ArrayAgg
from django.contrib.postgres.fields import ArrayField
from django.core.serializers.json import DjangoJSONEncoder
from django.db.models import (
Exists,
F,
Func,
OuterRef,
Prefetch,
Q,
UUIDField,
Value,
)
from django.db.models.functions import Coalesce
from django.utils import timezone
from django.utils.decorators import method_decorator
from django.views.decorators.gzip import gzip_page
# Third Party imports
from rest_framework import status
from rest_framework.response import Response
# Module imports
from plane.app.permissions import ProjectEntityPermission
from plane.app.serializers import (
IssueCreateSerializer,
IssueDetailSerializer,
IssueFlatSerializer,
IssueSerializer,
)
from plane.bgtasks.issue_activities_task import issue_activity
from plane.db.models import (
Issue,
IssueAttachment,
IssueLink,
IssueReaction,
IssueSubscriber,
Project,
ProjectMember,
)
from plane.utils.grouper import (
issue_group_values,
issue_on_results,
issue_queryset_grouper,
)
from plane.utils.issue_filters import issue_filters
from plane.utils.order_queryset import order_issue_queryset
from plane.utils.paginator import (
GroupedOffsetPaginator,
SubGroupedOffsetPaginator,
)
from .. import BaseViewSet
class IssueDraftViewSet(BaseViewSet):
permission_classes = [
ProjectEntityPermission,
]
serializer_class = IssueFlatSerializer
model = Issue
def get_queryset(self):
return (
Issue.objects.filter(project_id=self.kwargs.get("project_id"))
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(is_draft=True)
.filter(deleted_at__isnull=True)
.select_related("workspace", "project", "state", "parent")
.prefetch_related("assignees", "labels", "issue_module__module")
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
).distinct()
@method_decorator(gzip_page)
def list(self, request, slug, project_id):
filters = issue_filters(request.query_params, "GET")
order_by_param = request.GET.get("order_by", "-created_at")
issue_queryset = self.get_queryset().filter(**filters)
# Issue queryset
issue_queryset, order_by_param = order_issue_queryset(
issue_queryset=issue_queryset,
order_by_param=order_by_param,
)
# Group by
group_by = request.GET.get("group_by", False)
sub_group_by = request.GET.get("sub_group_by", False)
# issue queryset
issue_queryset = issue_queryset_grouper(
queryset=issue_queryset,
group_by=group_by,
sub_group_by=sub_group_by,
)
if group_by:
# Check group and sub group value paginate
if sub_group_by:
if group_by == sub_group_by:
return Response(
{
"error": "Group by and sub group by cannot have same parameters"
},
status=status.HTTP_400_BAD_REQUEST,
)
else:
# group and sub group pagination
return self.paginate(
request=request,
order_by=order_by_param,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by,
issues=issues,
sub_group_by=sub_group_by,
),
paginator_cls=SubGroupedOffsetPaginator,
group_by_fields=issue_group_values(
field=group_by,
slug=slug,
project_id=project_id,
filters=filters,
),
sub_group_by_fields=issue_group_values(
field=sub_group_by,
slug=slug,
project_id=project_id,
filters=filters,
),
group_by_field_name=group_by,
sub_group_by_field_name=sub_group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
)
# Group Paginate
else:
# Group paginate
return self.paginate(
request=request,
order_by=order_by_param,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by,
issues=issues,
sub_group_by=sub_group_by,
),
paginator_cls=GroupedOffsetPaginator,
group_by_fields=issue_group_values(
field=group_by,
slug=slug,
project_id=project_id,
filters=filters,
),
group_by_field_name=group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
)
else:
# List Paginate
return self.paginate(
order_by=order_by_param,
request=request,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by, issues=issues, sub_group_by=sub_group_by
),
)
def create(self, request, slug, project_id):
project = Project.objects.get(pk=project_id)
serializer = IssueCreateSerializer(
data=request.data,
context={
"project_id": project_id,
"workspace_id": project.workspace_id,
"default_assignee_id": project.default_assignee_id,
},
)
if serializer.is_valid():
serializer.save(is_draft=True)
# Track the issue
issue_activity.delay(
type="issue_draft.activity.created",
requested_data=json.dumps(
self.request.data, cls=DjangoJSONEncoder
),
actor_id=str(request.user.id),
issue_id=str(serializer.data.get("id", None)),
project_id=str(project_id),
current_instance=None,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
)
issue = (
issue_queryset_grouper(
queryset=self.get_queryset().filter(
pk=serializer.data["id"]
),
group_by=None,
sub_group_by=None,
)
.values(
"id",
"name",
"state_id",
"sort_order",
"completed_at",
"estimate_point",
"priority",
"start_date",
"target_date",
"sequence_id",
"project_id",
"parent_id",
"cycle_id",
"module_ids",
"label_ids",
"assignee_ids",
"sub_issues_count",
"created_at",
"updated_at",
"created_by",
"updated_by",
"attachment_count",
"link_count",
"is_draft",
"archived_at",
)
.first()
)
return Response(issue, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def partial_update(self, request, slug, project_id, pk):
issue = self.get_queryset().filter(pk=pk).first()
if not issue:
return Response(
{"error": "Issue does not exist"},
status=status.HTTP_404_NOT_FOUND,
)
serializer = IssueCreateSerializer(
issue, data=request.data, partial=True
)
if serializer.is_valid():
serializer.save()
issue_activity.delay(
type="issue_draft.activity.updated",
requested_data=json.dumps(request.data, cls=DjangoJSONEncoder),
actor_id=str(self.request.user.id),
issue_id=str(self.kwargs.get("pk", None)),
project_id=str(self.kwargs.get("project_id", None)),
current_instance=json.dumps(
IssueSerializer(issue).data,
cls=DjangoJSONEncoder,
),
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
)
return Response(status=status.HTTP_204_NO_CONTENT)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def retrieve(self, request, slug, project_id, pk=None):
issue = (
self.get_queryset()
.filter(pk=pk)
.annotate(
label_ids=Coalesce(
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
assignee_ids=Coalesce(
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
module_ids=Coalesce(
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=~Q(issue_module__module_id__isnull=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
)
.prefetch_related(
Prefetch(
"issue_reactions",
queryset=IssueReaction.objects.select_related(
"issue", "actor"
),
)
)
.prefetch_related(
Prefetch(
"issue_attachment",
queryset=IssueAttachment.objects.select_related("issue"),
)
)
.prefetch_related(
Prefetch(
"issue_link",
queryset=IssueLink.objects.select_related("created_by"),
)
)
.annotate(
is_subscribed=Exists(
IssueSubscriber.objects.filter(
workspace__slug=slug,
project_id=project_id,
issue_id=OuterRef("pk"),
subscriber=request.user,
)
)
)
).first()
if not issue:
return Response(
{"error": "The required object does not exist."},
status=status.HTTP_404_NOT_FOUND,
)
serializer = IssueDetailSerializer(issue, expand=self.expand)
return Response(serializer.data, status=status.HTTP_200_OK)
def destroy(self, request, slug, project_id, pk=None):
issue = Issue.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
if issue.created_by_id != request.user.id and (
not ProjectMember.objects.filter(
workspace__slug=slug,
member=request.user,
role=20,
project_id=project_id,
is_active=True,
).exists()
):
return Response(
{"error": "Only admin or creator can delete the issue"},
status=status.HTTP_403_FORBIDDEN,
)
issue.delete()
issue_activity.delay(
type="issue_draft.activity.deleted",
requested_data=json.dumps({"issue_id": str(pk)}),
actor_id=str(request.user.id),
issue_id=str(pk),
project_id=str(project_id),
current_instance={},
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
)
return Response(status=status.HTTP_204_NO_CONTENT)


@@ -3,7 +3,16 @@ import json
# Django imports
from django.utils import timezone
from django.db.models import Q, OuterRef, F, Func, UUIDField, Value, CharField
from django.db.models import (
Q,
OuterRef,
F,
Func,
UUIDField,
Value,
CharField,
Subquery,
)
from django.core.serializers.json import DjangoJSONEncoder
from django.db.models.functions import Coalesce
from django.contrib.postgres.aggregates import ArrayAgg
@@ -24,10 +33,12 @@ from plane.db.models import (
Project,
IssueRelation,
Issue,
IssueAttachment,
FileAsset,
IssueLink,
CycleIssue,
)
from plane.bgtasks.issue_activities_task import issue_activity
from plane.utils.issue_relation_mapper import get_actual_relation
class IssueRelationViewSet(BaseViewSet):
@@ -79,11 +90,37 @@ class IssueRelationViewSet(BaseViewSet):
related_issue_id=issue_id, relation_type="relates_to"
).values_list("issue_id", flat=True)
# get all start after issues
start_after_issues = issue_relations.filter(
relation_type="start_before", related_issue_id=issue_id
).values_list("issue_id", flat=True)
# get all start_before issues
start_before_issues = issue_relations.filter(
relation_type="start_before", issue_id=issue_id
).values_list("related_issue_id", flat=True)
# get all finish after issues
finish_after_issues = issue_relations.filter(
relation_type="finish_before", related_issue_id=issue_id
).values_list("issue_id", flat=True)
# get all finish before issues
finish_before_issues = issue_relations.filter(
relation_type="finish_before", issue_id=issue_id
).values_list("related_issue_id", flat=True)
queryset = (
Issue.issue_objects.filter(workspace__slug=slug)
.select_related("workspace", "project", "state", "parent")
.prefetch_related("assignees", "labels", "issue_module__module")
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(
cycle_id=Subquery(
CycleIssue.objects.filter(
issue=OuterRef("id"), deleted_at__isnull=True
).values("cycle_id")[:1]
)
)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@@ -91,8 +128,9 @@ class IssueRelationViewSet(BaseViewSet):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -111,7 +149,10 @@ class IssueRelationViewSet(BaseViewSet):
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
filter=Q(
~Q(labels__id__isnull=True)
& (Q(label_issue__deleted_at__isnull=True))
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -119,8 +160,11 @@ class IssueRelationViewSet(BaseViewSet):
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True),
filter=Q(
~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True)
& Q(issue_assignee__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -188,12 +232,50 @@ class IssueRelationViewSet(BaseViewSet):
)
)
.values(*fields),
"start_after": queryset.filter(pk__in=start_after_issues)
.annotate(
relation_type=Value(
"start_after",
output_field=CharField(),
)
)
.values(*fields),
"start_before": queryset.filter(pk__in=start_before_issues)
.annotate(
relation_type=Value(
"start_before",
output_field=CharField(),
)
)
.values(*fields),
"finish_after": queryset.filter(pk__in=finish_after_issues)
.annotate(
relation_type=Value(
"finish_after",
output_field=CharField(),
)
)
.values(*fields),
"finish_before": queryset.filter(pk__in=finish_before_issues)
.annotate(
relation_type=Value(
"finish_before",
output_field=CharField(),
)
)
.values(*fields),
}
return Response(response_data, status=status.HTTP_200_OK)
def create(self, request, slug, project_id, issue_id):
relation_type = request.data.get("relation_type", None)
if relation_type is None:
return Response(
{"message": "Issue relation type is required"},
status=status.HTTP_400_BAD_REQUEST,
)
issues = request.data.get("issues", [])
project = Project.objects.get(pk=project_id)
@@ -201,16 +283,18 @@ class IssueRelationViewSet(BaseViewSet):
[
IssueRelation(
issue_id=(
issue if relation_type == "blocking" else issue_id
issue
if relation_type
in ["blocking", "start_after", "finish_after"]
else issue_id
),
related_issue_id=(
issue_id if relation_type == "blocking" else issue
),
relation_type=(
"blocked_by"
if relation_type == "blocking"
else relation_type
issue_id
if relation_type
in ["blocking", "start_after", "finish_after"]
else issue
),
relation_type=(get_actual_relation(relation_type)),
project_id=project_id,
workspace_id=project.workspace_id,
created_by=request.user,
@@ -234,7 +318,7 @@ class IssueRelationViewSet(BaseViewSet):
origin=request.META.get("HTTP_ORIGIN"),
)
if relation_type == "blocking":
if relation_type in ["blocking", "start_after", "finish_after"]:
return Response(
RelatedIssueSerializer(issue_relation, many=True).data,
status=status.HTTP_201_CREATED,
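The relation-create hunk replaces the hard-coded `"blocking"` → `"blocked_by"` swap with `get_actual_relation` and extends the inverted cases to `start_after` and `finish_after`. Its mapping is not shown in this diff, but the query hunks above (where `start_after` issues are looked up via `relation_type="start_before"` with issue/related reversed, and likewise for `finish_after`/`finish_before`) imply it. A hypothetical sketch of that mapper:

```python
# Hypothetical sketch of plane.utils.issue_relation_mapper.get_actual_relation,
# inferred from the diff: user-facing inverse relation names map onto the
# canonical types actually stored on IssueRelation rows.
INVERSE_RELATIONS = {
    "blocking": "blocked_by",
    "start_after": "start_before",
    "finish_after": "finish_before",
}


def get_actual_relation(relation_type):
    """Return the canonical stored relation type; symmetric or already
    canonical types pass through unchanged."""
    return INVERSE_RELATIONS.get(relation_type, relation_type)
```

For the three inverted types the endpoint also swaps `issue_id` and `related_issue_id`, so only the canonical direction is ever persisted.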
@@ -249,7 +333,7 @@ class IssueRelationViewSet(BaseViewSet):
relation_type = request.data.get("relation_type", None)
related_issue = request.data.get("related_issue", None)
if relation_type == "blocking":
if relation_type in ["blocking", "start_after", "finish_after"]:
issue_relation = IssueRelation.objects.get(
workspace__slug=slug,
project_id=project_id,
@@ -267,7 +351,7 @@ class IssueRelationViewSet(BaseViewSet):
IssueRelationSerializer(issue_relation).data,
cls=DjangoJSONEncoder,
)
issue_relation.delete(soft=False)
issue_relation.delete()
issue_activity.delay(
type="issue_relation.activity.deleted",
requested_data=json.dumps(request.data, cls=DjangoJSONEncoder),

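The new create path delegates relation naming to `get_actual_relation`, whose definition is outside this diff. A hedged reconstruction of what it plausibly does, inferred from the directional types handled above and the `"finish_before"` bucket queried in the list response (the mapping is an assumption, not the confirmed implementation):

```python
# Hypothetical sketch: the real get_actual_relation lives outside this diff.
# Directional relation types are stored from the related issue's perspective.
INVERSE_RELATION = {
    "blocking": "blocked_by",
    "start_after": "start_before",
    "finish_after": "finish_before",
}

def get_actual_relation(relation_type: str) -> str:
    # Symmetric types ("relates_to", "duplicate", ...) pass through unchanged.
    return INVERSE_RELATION.get(relation_type, relation_type)
```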
View File

@@ -3,14 +3,7 @@ import json
# Django imports
from django.utils import timezone
from django.db.models import OuterRef, Func, F, Q, Value, UUIDField, Subquery
from django.utils.decorators import method_decorator
from django.views.decorators.gzip import gzip_page
from django.contrib.postgres.aggregates import ArrayAgg
@@ -28,7 +21,8 @@ from plane.app.permissions import ProjectEntityPermission
from plane.db.models import (
Issue,
IssueLink,
FileAsset,
CycleIssue,
)
from plane.bgtasks.issue_activities_task import issue_activity
from plane.utils.user_timezone_converter import user_timezone_converter
@@ -48,7 +42,13 @@ class SubIssuesEndpoint(BaseAPIView):
)
.select_related("workspace", "project", "state", "parent")
.prefetch_related("assignees", "labels", "issue_module__module")
.annotate(
cycle_id=Subquery(
CycleIssue.objects.filter(
issue=OuterRef("id"), deleted_at__isnull=True
).values("cycle_id")[:1]
)
)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@@ -56,8 +56,9 @@ class SubIssuesEndpoint(BaseAPIView):
.values("count")
)
.annotate(
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -76,7 +77,10 @@ class SubIssuesEndpoint(BaseAPIView):
ArrayAgg(
"labels__id",
distinct=True,
filter=Q(
~Q(labels__id__isnull=True)
& Q(label_issue__deleted_at__isnull=True),
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -84,8 +88,11 @@ class SubIssuesEndpoint(BaseAPIView):
ArrayAgg(
"assignees__id",
distinct=True,
filter=Q(
~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True)
& Q(issue_assignee__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -93,7 +100,11 @@ class SubIssuesEndpoint(BaseAPIView):
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=Q(
~Q(issue_module__module_id__isnull=True)
& Q(issue_module__module__archived_at__isnull=True)
& Q(issue_module__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),

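The recurring switch from `.annotate(cycle_id=F("issue_cycle__cycle_id"))` to a `Subquery` filtered on `deleted_at__isnull=True` exists because the plain join annotation still matches soft-deleted `CycleIssue` rows. A plain-Python sketch of the intended lookup, with dicts standing in for rows:

```python
def resolve_cycle_id(issue_id, cycle_issue_rows):
    """Mimics Subquery(CycleIssue.objects.filter(issue=OuterRef("id"),
    deleted_at__isnull=True).values("cycle_id")[:1]):
    first live row wins, soft-deleted rows are ignored."""
    for row in cycle_issue_rows:
        if row["issue_id"] == issue_id and row["deleted_at"] is None:
            return row["cycle_id"]
    return None  # an empty Subquery annotates NULL

rows = [
    {"issue_id": 1, "cycle_id": "c-old", "deleted_at": "2024-11-01"},  # soft-deleted
    {"issue_id": 1, "cycle_id": "c-new", "deleted_at": None},
]
```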
View File

@@ -14,9 +14,12 @@ from django.db.models import (
Value,
Sum,
FloatField,
Case,
When,
)
from django.db.models.functions import Coalesce, Cast, Concat
from django.utils import timezone
from django.db import models
# Third party imports
from rest_framework import status
@@ -54,6 +57,7 @@ class ModuleArchiveUnarchiveEndpoint(BaseAPIView):
Issue.issue_objects.filter(
state__group="cancelled",
issue_module__module_id=OuterRef("pk"),
issue_module__deleted_at__isnull=True,
)
.values("issue_module__module_id")
.annotate(cnt=Count("pk"))
@@ -63,6 +67,7 @@ class ModuleArchiveUnarchiveEndpoint(BaseAPIView):
Issue.issue_objects.filter(
state__group="completed",
issue_module__module_id=OuterRef("pk"),
issue_module__deleted_at__isnull=True,
)
.values("issue_module__module_id")
.annotate(cnt=Count("pk"))
@@ -72,6 +77,7 @@ class ModuleArchiveUnarchiveEndpoint(BaseAPIView):
Issue.issue_objects.filter(
state__group="started",
issue_module__module_id=OuterRef("pk"),
issue_module__deleted_at__isnull=True,
)
.values("issue_module__module_id")
.annotate(cnt=Count("pk"))
@@ -81,6 +87,7 @@ class ModuleArchiveUnarchiveEndpoint(BaseAPIView):
Issue.issue_objects.filter(
state__group="unstarted",
issue_module__module_id=OuterRef("pk"),
issue_module__deleted_at__isnull=True,
)
.values("issue_module__module_id")
.annotate(cnt=Count("pk"))
@@ -90,6 +97,7 @@ class ModuleArchiveUnarchiveEndpoint(BaseAPIView):
Issue.issue_objects.filter(
state__group="backlog",
issue_module__module_id=OuterRef("pk"),
issue_module__deleted_at__isnull=True,
)
.values("issue_module__module_id")
.annotate(cnt=Count("pk"))
@@ -98,6 +106,7 @@ class ModuleArchiveUnarchiveEndpoint(BaseAPIView):
total_issues = (
Issue.issue_objects.filter(
issue_module__module_id=OuterRef("pk"),
issue_module__deleted_at__isnull=True,
)
.values("issue_module__module_id")
.annotate(cnt=Count("pk"))
@@ -108,6 +117,7 @@ class ModuleArchiveUnarchiveEndpoint(BaseAPIView):
estimate_point__estimate__type="points",
state__group="completed",
issue_module__module_id=OuterRef("pk"),
issue_module__deleted_at__isnull=True,
)
.values("issue_module__module_id")
.annotate(
@@ -122,6 +132,7 @@ class ModuleArchiveUnarchiveEndpoint(BaseAPIView):
Issue.issue_objects.filter(
estimate_point__estimate__type="points",
issue_module__module_id=OuterRef("pk"),
issue_module__deleted_at__isnull=True,
)
.values("issue_module__module_id")
.annotate(
@@ -136,6 +147,7 @@ class ModuleArchiveUnarchiveEndpoint(BaseAPIView):
estimate_point__estimate__type="points",
state__group="backlog",
issue_module__module_id=OuterRef("pk"),
issue_module__deleted_at__isnull=True,
)
.values("issue_module__module_id")
.annotate(
@@ -150,6 +162,7 @@ class ModuleArchiveUnarchiveEndpoint(BaseAPIView):
estimate_point__estimate__type="points",
state__group="unstarted",
issue_module__module_id=OuterRef("pk"),
issue_module__deleted_at__isnull=True,
)
.values("issue_module__module_id")
.annotate(
@@ -164,6 +177,7 @@ class ModuleArchiveUnarchiveEndpoint(BaseAPIView):
estimate_point__estimate__type="points",
state__group="started",
issue_module__module_id=OuterRef("pk"),
issue_module__deleted_at__isnull=True,
)
.values("issue_module__module_id")
.annotate(
@@ -178,6 +192,7 @@ class ModuleArchiveUnarchiveEndpoint(BaseAPIView):
estimate_point__estimate__type="points",
state__group="cancelled",
issue_module__module_id=OuterRef("pk"),
issue_module__deleted_at__isnull=True,
)
.values("issue_module__module_id")
.annotate(
@@ -334,6 +349,7 @@ class ModuleArchiveUnarchiveEndpoint(BaseAPIView):
project_id=self.kwargs.get("project_id"),
parent__isnull=False,
issue_module__module_id=pk,
issue_module__deleted_at__isnull=True,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -357,6 +373,7 @@ class ModuleArchiveUnarchiveEndpoint(BaseAPIView):
assignee_distribution = (
Issue.issue_objects.filter(
issue_module__module_id=pk,
issue_module__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
)
@@ -364,12 +381,31 @@ class ModuleArchiveUnarchiveEndpoint(BaseAPIView):
.annotate(last_name=F("assignees__last_name"))
.annotate(assignee_id=F("assignees__id"))
.annotate(display_name=F("assignees__display_name"))
.annotate(
avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
assignees__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"assignees__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
assignees__avatar_asset__isnull=True,
then="assignees__avatar",
),
default=Value(None),
output_field=models.CharField(),
)
)
.values(
"first_name",
"last_name",
"assignee_id",
"avatar_url",
"display_name",
)
.annotate(
@@ -437,7 +473,9 @@ class ModuleArchiveUnarchiveEndpoint(BaseAPIView):
)
.order_by("label_name")
)
data["estimate_distribution"][
"assignees"
] = assignee_distribution
data["estimate_distribution"]["labels"] = label_distribution
if modules and modules.start_date and modules.target_date:
@@ -454,6 +492,7 @@ class ModuleArchiveUnarchiveEndpoint(BaseAPIView):
assignee_distribution = (
Issue.issue_objects.filter(
issue_module__module_id=pk,
issue_module__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
)
@@ -461,12 +500,31 @@ class ModuleArchiveUnarchiveEndpoint(BaseAPIView):
.annotate(last_name=F("assignees__last_name"))
.annotate(assignee_id=F("assignees__id"))
.annotate(display_name=F("assignees__display_name"))
.annotate(
avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
assignees__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"assignees__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
assignees__avatar_asset__isnull=True,
then="assignees__avatar",
),
default=Value(None),
output_field=models.CharField(),
)
)
.values(
"first_name",
"last_name",
"assignee_id",
"avatar_url",
"display_name",
)
.annotate(
@@ -504,6 +562,7 @@ class ModuleArchiveUnarchiveEndpoint(BaseAPIView):
label_distribution = (
Issue.issue_objects.filter(
issue_module__module_id=pk,
issue_module__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
)

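The repeated `Case`/`When` annotation builds `avatar_url` from `avatar_asset` when it is set and otherwise falls back to the legacy `avatar` field. The equivalent per-row logic as a plain function — the `/api/assets/v2/static/` prefix comes from the diff, while treating the asset as its id is an assumption the in-code comment itself flags:

```python
def resolve_avatar_url(avatar_asset_id, avatar):
    # Mirrors the Case/When: the uploaded asset takes precedence
    # over the legacy avatar field.
    if avatar_asset_id is not None:
        return f"/api/assets/v2/static/{avatar_asset_id}/"
    return avatar  # may be None, matching default=Value(None)
```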
View File

@@ -18,8 +18,11 @@ from django.db.models import (
Value,
Sum,
FloatField,
Case,
When,
)
from django.db import models
from django.db.models.functions import Coalesce, Cast, Concat
from django.core.serializers.json import DjangoJSONEncoder
from django.utils import timezone
@@ -30,6 +33,7 @@ from rest_framework.response import Response
# Module imports
from plane.app.permissions import (
ProjectEntityPermission,
ProjectLitePermission,
allow_permission,
ROLE,
)
@@ -81,6 +85,7 @@ class ModuleViewSet(BaseViewSet):
Issue.issue_objects.filter(
state__group="cancelled",
issue_module__module_id=OuterRef("pk"),
issue_module__deleted_at__isnull=True,
)
.values("issue_module__module_id")
.annotate(cnt=Count("pk"))
@@ -90,6 +95,7 @@ class ModuleViewSet(BaseViewSet):
Issue.issue_objects.filter(
state__group="completed",
issue_module__module_id=OuterRef("pk"),
issue_module__deleted_at__isnull=True,
)
.values("issue_module__module_id")
.annotate(cnt=Count("pk"))
@@ -99,6 +105,7 @@ class ModuleViewSet(BaseViewSet):
Issue.issue_objects.filter(
state__group="started",
issue_module__module_id=OuterRef("pk"),
issue_module__deleted_at__isnull=True,
)
.values("issue_module__module_id")
.annotate(cnt=Count("pk"))
@@ -108,6 +115,7 @@ class ModuleViewSet(BaseViewSet):
Issue.issue_objects.filter(
state__group="unstarted",
issue_module__module_id=OuterRef("pk"),
issue_module__deleted_at__isnull=True,
)
.values("issue_module__module_id")
.annotate(cnt=Count("pk"))
@@ -117,6 +125,7 @@ class ModuleViewSet(BaseViewSet):
Issue.issue_objects.filter(
state__group="backlog",
issue_module__module_id=OuterRef("pk"),
issue_module__deleted_at__isnull=True,
)
.values("issue_module__module_id")
.annotate(cnt=Count("pk"))
@@ -125,6 +134,7 @@ class ModuleViewSet(BaseViewSet):
total_issues = (
Issue.issue_objects.filter(
issue_module__module_id=OuterRef("pk"),
issue_module__deleted_at__isnull=True,
)
.values("issue_module__module_id")
.annotate(cnt=Count("pk"))
@@ -135,6 +145,7 @@ class ModuleViewSet(BaseViewSet):
estimate_point__estimate__type="points",
state__group="completed",
issue_module__module_id=OuterRef("pk"),
issue_module__deleted_at__isnull=True,
)
.values("issue_module__module_id")
.annotate(
@@ -149,6 +160,7 @@ class ModuleViewSet(BaseViewSet):
Issue.issue_objects.filter(
estimate_point__estimate__type="points",
issue_module__module_id=OuterRef("pk"),
issue_module__deleted_at__isnull=True,
)
.values("issue_module__module_id")
.annotate(
@@ -163,6 +175,7 @@ class ModuleViewSet(BaseViewSet):
estimate_point__estimate__type="points",
state__group="backlog",
issue_module__module_id=OuterRef("pk"),
issue_module__deleted_at__isnull=True,
)
.values("issue_module__module_id")
.annotate(
@@ -177,6 +190,7 @@ class ModuleViewSet(BaseViewSet):
estimate_point__estimate__type="points",
state__group="unstarted",
issue_module__module_id=OuterRef("pk"),
issue_module__deleted_at__isnull=True,
)
.values("issue_module__module_id")
.annotate(
@@ -191,6 +205,7 @@ class ModuleViewSet(BaseViewSet):
estimate_point__estimate__type="points",
state__group="started",
issue_module__module_id=OuterRef("pk"),
issue_module__deleted_at__isnull=True,
)
.values("issue_module__module_id")
.annotate(
@@ -205,6 +220,7 @@ class ModuleViewSet(BaseViewSet):
estimate_point__estimate__type="points",
state__group="cancelled",
issue_module__module_id=OuterRef("pk"),
issue_module__deleted_at__isnull=True,
)
.values("issue_module__module_id")
.annotate(
@@ -317,13 +333,12 @@ class ModuleViewSet(BaseViewSet):
.order_by("-is_favorite", "-created_at")
)
@allow_permission(
[
ROLE.ADMIN,
ROLE.MEMBER,
]
)
def create(self, request, slug, project_id):
project = Project.objects.get(workspace__slug=slug, pk=project_id)
serializer = ModuleWriteSerializer(
@@ -386,8 +401,7 @@ class ModuleViewSet(BaseViewSet):
return Response(module, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def list(self, request, slug, project_id):
queryset = self.get_queryset().filter(archived_at__isnull=True)
if self.fields:
@@ -435,13 +449,7 @@ class ModuleViewSet(BaseViewSet):
)
return Response(modules, status=status.HTTP_200_OK)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def retrieve(self, request, slug, project_id, pk):
queryset = (
self.get_queryset()
@@ -452,6 +460,7 @@ class ModuleViewSet(BaseViewSet):
project_id=self.kwargs.get("project_id"),
parent__isnull=False,
issue_module__module_id=pk,
issue_module__deleted_at__isnull=True,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -481,6 +490,7 @@ class ModuleViewSet(BaseViewSet):
assignee_distribution = (
Issue.issue_objects.filter(
issue_module__module_id=pk,
issue_module__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
)
@@ -488,12 +498,31 @@ class ModuleViewSet(BaseViewSet):
.annotate(last_name=F("assignees__last_name"))
.annotate(assignee_id=F("assignees__id"))
.annotate(display_name=F("assignees__display_name"))
.annotate(
avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
assignees__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"assignees__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
assignees__avatar_asset__isnull=True,
then="assignees__avatar",
),
default=Value(None),
output_field=models.CharField(),
)
)
.values(
"first_name",
"last_name",
"assignee_id",
"avatar_url",
"display_name",
)
.annotate(
@@ -527,6 +556,7 @@ class ModuleViewSet(BaseViewSet):
label_distribution = (
Issue.issue_objects.filter(
issue_module__module_id=pk,
issue_module__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
)
@@ -578,6 +608,7 @@ class ModuleViewSet(BaseViewSet):
assignee_distribution = (
Issue.issue_objects.filter(
issue_module__module_id=pk,
issue_module__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
)
@@ -585,12 +616,31 @@ class ModuleViewSet(BaseViewSet):
.annotate(last_name=F("assignees__last_name"))
.annotate(assignee_id=F("assignees__id"))
.annotate(display_name=F("assignees__display_name"))
.annotate(
avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
assignees__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"assignees__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
assignees__avatar_asset__isnull=True,
then="assignees__avatar",
),
default=Value(None),
output_field=models.CharField(),
)
)
.values(
"first_name",
"last_name",
"assignee_id",
"avatar_url",
"display_name",
)
.annotate(
@@ -628,6 +678,7 @@ class ModuleViewSet(BaseViewSet):
label_distribution = (
Issue.issue_objects.filter(
issue_module__module_id=pk,
issue_module__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
)
@@ -672,7 +723,13 @@ class ModuleViewSet(BaseViewSet):
"labels": label_distribution,
"completion_chart": {},
}
if (
modules
and modules.start_date
and modules.target_date
and modules.total_issues > 0
):
data["distribution"]["completion_chart"] = burndown_plot(
queryset=modules,
slug=slug,
@@ -838,6 +895,9 @@ class ModuleLinkViewSet(BaseViewSet):
class ModuleFavoriteViewSet(BaseViewSet):
model = UserFavorite
permission_classes = [
ProjectLitePermission,
]
def get_queryset(self):
return self.filter_queryset(

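Several hunks in this file restore a missing `@` on `allow_permission(...)`: written as a bare call, the decorator factory runs and its result is silently discarded, leaving the view unguarded. A minimal sketch of how such a role-gating decorator might work (the real `plane.app.permissions` version also supports `creator`/`model`/`level` and returns a DRF 403 Response rather than raising):

```python
from enum import IntEnum
from functools import wraps

class ROLE(IntEnum):
    ADMIN = 20
    MEMBER = 15
    GUEST = 5

def allow_permission(allowed_roles):
    # Hypothetical sketch; role resolution in the real code hits the database.
    def decorator(view):
        @wraps(view)
        def wrapped(self, request, *args, **kwargs):
            if request.role not in allowed_roles:
                raise PermissionError("You don't have the required permissions.")
            return view(self, request, *args, **kwargs)
        return wrapped
    return decorator

# Without the leading "@", allow_permission([...]) evaluates to a decorator
# that is never applied, so the view runs with no permission check at all.
```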
View File

@@ -1,12 +1,7 @@
# Python imports
import json
from django.db.models import F, Func, OuterRef, Q, Subquery
# Django Imports
from django.utils import timezone
@@ -24,10 +19,11 @@ from plane.app.serializers import (
from plane.bgtasks.issue_activities_task import issue_activity
from plane.db.models import (
Issue,
FileAsset,
IssueLink,
ModuleIssue,
Project,
CycleIssue,
)
from plane.utils.grouper import (
issue_group_values,
@@ -62,10 +58,17 @@ class ModuleIssueViewSet(BaseViewSet):
project_id=self.kwargs.get("project_id"),
workspace__slug=self.kwargs.get("slug"),
issue_module__module_id=self.kwargs.get("module_id"),
issue_module__deleted_at__isnull=True,
)
.select_related("workspace", "project", "state", "parent")
.prefetch_related("assignees", "labels", "issue_module__module")
.annotate(
cycle_id=Subquery(
CycleIssue.objects.filter(
issue=OuterRef("id"), deleted_at__isnull=True
).values("cycle_id")[:1]
)
)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@@ -73,8 +76,9 @@ class ModuleIssueViewSet(BaseViewSet):
.values("count")
)
.annotate(
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -156,10 +160,10 @@ class ModuleIssueViewSet(BaseViewSet):
group_by_field_name=group_by,
sub_group_by_field_name=sub_group_by,
count_filter=Q(
Q(issue_intake__status=1)
| Q(issue_intake__status=-1)
| Q(issue_intake__status=2)
| Q(issue_intake__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
@@ -185,10 +189,10 @@ class ModuleIssueViewSet(BaseViewSet):
),
group_by_field_name=group_by,
count_filter=Q(
Q(issue_intake__status=1)
| Q(issue_intake__status=-1)
| Q(issue_intake__status=2)
| Q(issue_intake__isnull=True),
archived_at__isnull=True,
is_draft=False,
),

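The `count_filter` above renames `issue_inbox` lookups to `issue_intake` but keeps the semantics: an issue counts toward the module only when it never went through intake (`isnull=True`) or its intake status is 1, -1, or 2, and it is neither archived nor a draft. Restated as a plain predicate (the names for the excluded status codes are an assumption; the diff only shows which codes are counted):

```python
def counts_toward_module(intake_status, archived_at, is_draft):
    """Mirror of the count_filter: triaged intake statuses (1, -1, 2) and
    issues with no intake record (None) count; other codes (e.g. pending)
    do not, and archived or draft issues never count."""
    return (
        intake_status in (1, -1, 2, None)
        and archived_at is None
        and not is_draft
    )
```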
View File

@@ -53,9 +53,9 @@ class NotificationViewSet(BaseViewSet, BasePaginator):
mentioned = request.GET.get("mentioned", False)
q_filters = Q()
intake_issue = Issue.objects.filter(
pk=OuterRef("entity_identifier"),
issue_intake__status__in=[0, 2, -2],
workspace__slug=self.kwargs.get("slug"),
)
@@ -64,7 +64,8 @@ class NotificationViewSet(BaseViewSet, BasePaginator):
workspace__slug=slug, receiver_id=request.user.id
)
.filter(entity_name="issue")
.annotate(is_inbox_issue=Exists(intake_issue))
.annotate(is_intake_issue=Exists(intake_issue))
.annotate(
is_mentioned_notification=Case(
When(sender__icontains="mentioned", then=True),

View File

@@ -18,7 +18,7 @@ from django.db.models.functions import Coalesce
from rest_framework import status
from rest_framework.response import Response
# Module imports
from plane.app.permissions import allow_permission, ROLE
from plane.app.serializers import (
PageLogSerializer,
@@ -35,10 +35,7 @@ from plane.db.models import (
Project,
)
from plane.utils.error_codes import ERROR_CODES
# Module imports
from ..base import BaseAPIView, BaseViewSet
from plane.bgtasks.page_transaction_task import page_transaction
from plane.bgtasks.page_version_task import page_version
from plane.bgtasks.recent_visited_task import recent_visited_task

View File

@@ -38,7 +38,7 @@ from plane.app.permissions import (
from plane.db.models import (
UserFavorite,
Cycle,
Inbox,
Intake,
DeployBoard,
IssueUserProperty,
Issue,
@@ -53,6 +53,7 @@ from plane.db.models import (
from plane.utils.cache import cache_response
from plane.bgtasks.webhook_task import model_activity
from plane.bgtasks.recent_visited_task import recent_visited_task
from plane.utils.exception_logger import log_exception
class ProjectViewSet(BaseViewSet):
@@ -413,10 +414,24 @@ class ProjectViewSet(BaseViewSet):
status=status.HTTP_410_GONE,
)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER], level="WORKSPACE")
def partial_update(self, request, slug, pk=None):
try:
if not ProjectMember.objects.filter(
member=request.user,
workspace__slug=slug,
project_id=pk,
role=20,
is_active=True,
).exists():
return Response(
{"error": "You don't have the required permissions."},
status=status.HTTP_403_FORBIDDEN,
)
workspace = Workspace.objects.get(slug=slug)
intake_view = request.data.get(
"inbox_view", request.data.get("intake_view", False)
)
project = Project.objects.get(pk=pk)
current_instance = json.dumps(
@@ -430,21 +445,24 @@ class ProjectViewSet(BaseViewSet):
serializer = ProjectSerializer(
project,
data={
**request.data,
"intake_view": intake_view,
},
context={"workspace_id": workspace.id},
partial=True,
)
if serializer.is_valid():
serializer.save()
if intake_view:
intake = Intake.objects.filter(
project=project,
is_default=True,
).first()
if not intake:
Intake.objects.create(
name=f"{project.name} Intake",
project=project,
is_default=True,
)
@@ -453,7 +471,7 @@ class ProjectViewSet(BaseViewSet):
State.objects.get_or_create(
name="Triage",
group="triage",
description="Default state for managing all Intake Issues",
project_id=pk,
color="#ff7700",
is_triage=True,
@@ -497,6 +515,44 @@ class ProjectViewSet(BaseViewSet):
status=status.HTTP_410_GONE,
)
def destroy(self, request, slug, pk):
if (
WorkspaceMember.objects.filter(
member=request.user,
workspace__slug=slug,
is_active=True,
role=20,
).exists()
or ProjectMember.objects.filter(
member=request.user,
workspace__slug=slug,
project_id=pk,
role=20,
is_active=True,
).exists()
):
project = Project.objects.get(pk=pk)
project.delete()
# Delete the project members
DeployBoard.objects.filter(
project_id=pk,
workspace__slug=slug,
).delete()
# Delete the user favorite
UserFavorite.objects.filter(
project_id=pk,
workspace__slug=slug,
).delete()
return Response(status=status.HTTP_204_NO_CONTENT)
else:
return Response(
{"error": "You don't have the required permissions."},
status=status.HTTP_403_FORBIDDEN,
)
class ProjectArchiveUnarchiveEndpoint(BaseAPIView):
@@ -671,18 +727,22 @@ class ProjectPublicCoverImagesEndpoint(BaseAPIView):
"Prefix": "static/project-cover/",
}
try:
response = s3.list_objects_v2(**params)
# Extracting file keys from the response
if "Contents" in response:
for content in response["Contents"]:
if not content["Key"].endswith(
"/"
): # This line ensures we're only getting files, not "sub-folders"
files.append(
f"https://{settings.AWS_STORAGE_BUCKET_NAME}.s3.{settings.AWS_REGION}.amazonaws.com/{content['Key']}"
)
return Response(files, status=status.HTTP_200_OK)
except Exception as e:
log_exception(e)
return Response([], status=status.HTTP_200_OK)
class DeployBoardViewSet(BaseViewSet):
@@ -705,7 +765,7 @@ class DeployBoardViewSet(BaseViewSet):
def create(self, request, slug, project_id):
comments = request.data.get("is_comments_enabled", False)
reactions = request.data.get("is_reactions_enabled", False)
intake = request.data.get("intake", None)
votes = request.data.get("is_votes_enabled", False)
views = request.data.get(
"views",
@@ -723,7 +783,7 @@ class DeployBoardViewSet(BaseViewSet):
entity_identifier=project_id,
project_id=project_id,
)
project_deploy_board.intake = intake
project_deploy_board.view_props = views
project_deploy_board.is_votes_enabled = votes
project_deploy_board.is_comments_enabled = comments

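`ProjectPublicCoverImagesEndpoint` now wraps the S3 listing in try/except so a misconfigured bucket degrades to an empty list instead of a 500. The same flow, factored into a helper that accepts any boto3-style client (bucket and region values here are placeholders):

```python
def list_cover_images(s3, bucket, region, prefix="static/project-cover/"):
    files = []
    try:
        response = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
        for content in response.get("Contents", []):
            # Keys ending in "/" are "sub-folder" placeholders, not files.
            if not content["Key"].endswith("/"):
                files.append(
                    f"https://{bucket}.s3.{region}.amazonaws.com/{content['Key']}"
                )
        return files
    except Exception:
        # The view logs via log_exception(e) and still answers 200 with [].
        return []
```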
View File

@@ -42,47 +42,56 @@ class IssueSearchEndpoint(BaseAPIView):
issues = search_issues(query, issues)
if parent == "true" and issue_id:
issue = Issue.issue_objects.filter(pk=issue_id).first()
if issue:
issues = issues.filter(
~Q(pk=issue_id),
~Q(pk=issue.parent_id),
~Q(parent_id=issue_id),
)
if issue_relation == "true" and issue_id:
issue = Issue.issue_objects.filter(pk=issue_id).first()
if issue:
issues = issues.filter(
~Q(pk=issue_id),
~(
Q(issue_related__issue=issue)
& Q(issue_related__deleted_at__isnull=True)
),
~(
Q(issue_relation__related_issue=issue)
& Q(issue_relation__deleted_at__isnull=True)
),
)
if sub_issue == "true" and issue_id:
issue = Issue.issue_objects.filter(pk=issue_id).first()
if issue:
issues = issues.filter(~Q(pk=issue_id), parent__isnull=True)
if issue.parent:
issues = issues.filter(~Q(pk=issue.parent_id))
if cycle == "true":
issues = issues.exclude(
Q(issue_cycle__isnull=False)
& Q(issue_cycle__deleted_at__isnull=True)
)
if module:
issues = issues.exclude(
Q(issue_module__module=module)
& Q(issue_module__deleted_at__isnull=True)
)
if target_date == "none":
issues = issues.filter(target_date__isnull=True)
if ProjectMember.objects.filter(
project_id=project_id,
member=self.request.user,
is_active=True,
role=5,
).exists():
issues = issues.filter(created_by=self.request.user)
return Response(
issues.values(

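Every branch of the search endpoint replaces `Issue.issue_objects.get(pk=issue_id)`, which raises `DoesNotExist` when the anchor issue was deleted mid-search, with `.filter(pk=issue_id).first()` behind an `if issue:` guard. The pattern in isolation:

```python
def first_or_none(rows, pk):
    """Analogue of queryset.filter(pk=pk).first(): returns None
    instead of raising when no row matches."""
    return next((row for row in rows if row["pk"] == pk), None)

rows = [{"pk": 1, "parent_id": None}]
issue = first_or_none(rows, 2)  # stale id: None instead of an exception
if issue:
    pass  # apply the exclusion filters only when the issue still exists
```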
View File

@@ -9,6 +9,7 @@ from django.db.models import (
Q,
UUIDField,
Value,
Subquery,
)
from django.db.models.functions import Coalesce
from django.utils.decorators import method_decorator
@@ -29,13 +30,14 @@ from plane.app.serializers import (
)
from plane.db.models import (
Issue,
FileAsset,
IssueLink,
IssueView,
Workspace,
WorkspaceMember,
ProjectMember,
Project,
CycleIssue,
)
from plane.utils.grouper import (
issue_group_values,
@@ -205,7 +207,13 @@ class WorkspaceViewIssuesViewSet(BaseViewSet):
)
.select_related("workspace", "project", "state", "parent")
.prefetch_related("assignees", "labels", "issue_module__module")
.annotate(
cycle_id=Subquery(
CycleIssue.objects.filter(
issue=OuterRef("id"), deleted_at__isnull=True
).values("cycle_id")[:1]
)
)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@@ -213,8 +221,9 @@ class WorkspaceViewIssuesViewSet(BaseViewSet):
.values("count")
)
.annotate(
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -233,7 +242,10 @@ class WorkspaceViewIssuesViewSet(BaseViewSet):
ArrayAgg(
"labels__id",
distinct=True,
filter=Q(
~Q(labels__id__isnull=True)
& Q(label_issue__deleted_at__isnull=True),
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -241,8 +253,11 @@ class WorkspaceViewIssuesViewSet(BaseViewSet):
ArrayAgg(
"assignees__id",
distinct=True,
filter=Q(
~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True)
& Q(issue_assignee__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -250,8 +265,11 @@ class WorkspaceViewIssuesViewSet(BaseViewSet):
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=Q(
~Q(issue_module__module_id__isnull=True)
& Q(issue_module__module__archived_at__isnull=True)
& Q(issue_module__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -270,7 +288,13 @@ class WorkspaceViewIssuesViewSet(BaseViewSet):
issue_queryset = (
self.get_queryset()
.filter(**filters)
.annotate(
cycle_id=Subquery(
CycleIssue.objects.filter(
issue=OuterRef("id"), deleted_at__isnull=True
).values("cycle_id")[:1]
)
)
)
# Check the project member role: if the role is 5 (guest), show all issues only when guest_view_all_features is enabled; otherwise show only the issues created by the user.
@@ -346,10 +370,10 @@ class WorkspaceViewIssuesViewSet(BaseViewSet):
group_by_field_name=group_by,
sub_group_by_field_name=sub_group_by,
count_filter=Q(
Q(issue_intake__status=1)
| Q(issue_intake__status=-1)
| Q(issue_intake__status=2)
| Q(issue_intake__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
@@ -375,10 +399,10 @@ class WorkspaceViewIssuesViewSet(BaseViewSet):
),
group_by_field_name=group_by,
count_filter=Q(
Q(issue_intake__status=1)
| Q(issue_intake__status=-1)
| Q(issue_intake__status=2)
| Q(issue_intake__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
@@ -431,8 +455,7 @@ class IssueViewViewSet(BaseViewSet):
.distinct()
)
@allow_permission(allowed_roles=[ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def list(self, request, slug, project_id):
queryset = self.get_queryset()
project = Project.objects.get(id=project_id)
@@ -457,8 +480,7 @@ class IssueViewViewSet(BaseViewSet):
).data
return Response(views, status=status.HTTP_200_OK)
@allow_permission(allowed_roles=[ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def retrieve(self, request, slug, project_id, pk):
issue_view = (
self.get_queryset().filter(pk=pk, project_id=project_id).first()
@@ -498,8 +520,7 @@ class IssueViewViewSet(BaseViewSet):
status=status.HTTP_200_OK,
)
@allow_permission(allowed_roles=[], creator=True, model=IssueView)
def partial_update(self, request, slug, project_id, pk):
with transaction.atomic():
issue_view = IssueView.objects.select_for_update().get(
@@ -532,8 +553,9 @@ class IssueViewViewSet(BaseViewSet):
serializer.errors, status=status.HTTP_400_BAD_REQUEST
)
-allow_permission(allowed_roles=[ROLE.ADMIN], creator=True, model=IssueView)
+@allow_permission(
+allowed_roles=[ROLE.ADMIN], creator=True, model=IssueView
+)
def destroy(self, request, slug, project_id, pk):
project_view = IssueView.objects.get(
pk=pk,
@@ -578,8 +600,7 @@ class IssueViewFavoriteViewSet(BaseViewSet):
.select_related("view")
)
-allow_permission([ROLE.ADMIN, ROLE.MEMBER])
+@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def create(self, request, slug, project_id):
_ = UserFavorite.objects.create(
user=request.user,
@@ -589,8 +610,7 @@ class IssueViewFavoriteViewSet(BaseViewSet):
)
return Response(status=status.HTTP_204_NO_CONTENT)
-allow_permission([ROLE.ADMIN, ROLE.MEMBER])
+@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def destroy(self, request, slug, project_id, view_id):
view_favorite = UserFavorite.objects.get(
project=project_id,
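The hunks above convert bare `allow_permission(...)` calls into applied `@allow_permission(...)` decorators. The shape of such a role-gated decorator can be sketched in plain Python (the `ROLE` values and the dict-based "request" are illustrative assumptions; Plane's real decorator also handles `creator` and workspace-level checks):

```python
from enum import IntEnum
from functools import wraps

class ROLE(IntEnum):
    ADMIN = 20
    MEMBER = 15
    GUEST = 5

def allow_permission(allowed_roles):
    """Reject the call unless the requesting user's role is in allowed_roles."""
    def decorator(view):
        @wraps(view)
        def wrapped(request, *args, **kwargs):
            if request["role"] not in allowed_roles:
                return {"status": 403}
            return view(request, *args, **kwargs)
        return wrapped
    return decorator

@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def create_favorite(request):
    return {"status": 204}
```

Note that without the leading `@`, the decorator expression is evaluated and discarded, so the view runs unguarded; that is exactly what the `+@allow_permission` lines fix.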

View File

@@ -33,6 +33,7 @@ class WorkspaceCyclesEndpoint(BaseAPIView):
filter=Q(
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -43,6 +44,8 @@ class WorkspaceCyclesEndpoint(BaseAPIView):
issue_cycle__issue__state__group="completed",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__issue__deleted_at__isnull=True,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -53,6 +56,8 @@ class WorkspaceCyclesEndpoint(BaseAPIView):
issue_cycle__issue__state__group="cancelled",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__issue__deleted_at__isnull=True,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -63,6 +68,8 @@ class WorkspaceCyclesEndpoint(BaseAPIView):
issue_cycle__issue__state__group="started",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__issue__deleted_at__isnull=True,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -73,6 +80,8 @@ class WorkspaceCyclesEndpoint(BaseAPIView):
issue_cycle__issue__state__group="unstarted",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__issue__deleted_at__isnull=True,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -83,6 +92,8 @@ class WorkspaceCyclesEndpoint(BaseAPIView):
issue_cycle__issue__state__group="backlog",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__issue__deleted_at__isnull=True,
issue_cycle__deleted_at__isnull=True,
),
)
)
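The repeated `deleted_at__isnull=True` additions above all enforce the same rule: soft-deleted rows must no longer contribute to per-state-group analytics counts. A minimal stand-in for the filtered `Count` aggregation, over plain dicts instead of `CycleIssue` rows:

```python
def count_by_group(issues, group):
    """Count issues in a state group, skipping soft-deleted and draft rows."""
    return sum(
        1
        for issue in issues
        if issue["group"] == group
        and issue["deleted_at"] is None  # mirrors deleted_at__isnull=True
        and not issue["is_draft"]        # mirrors is_draft=False
    )

issues = [
    {"group": "completed", "deleted_at": None, "is_draft": False},
    {"group": "completed", "deleted_at": "2024-11-01", "is_draft": False},
    {"group": "backlog", "deleted_at": None, "is_draft": True},
]
```

Without the `deleted_at` check, the second row would still inflate the "completed" count even though it was soft-deleted.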

View File

@@ -0,0 +1,352 @@
# Python imports
import json
# Django imports
from django.utils import timezone
from django.core import serializers
from django.core.serializers.json import DjangoJSONEncoder
from django.contrib.postgres.aggregates import ArrayAgg
from django.contrib.postgres.fields import ArrayField
from django.db.models import (
Q,
UUIDField,
Value,
Subquery,
OuterRef,
)
from django.db.models.functions import Coalesce
from django.utils.decorators import method_decorator
from django.views.decorators.gzip import gzip_page
# Third Party imports
from rest_framework import status
from rest_framework.response import Response
# Module imports
from plane.app.permissions import allow_permission, ROLE
from plane.app.serializers import (
IssueCreateSerializer,
DraftIssueCreateSerializer,
DraftIssueSerializer,
DraftIssueDetailSerializer,
)
from plane.db.models import (
Issue,
DraftIssue,
CycleIssue,
ModuleIssue,
DraftIssueCycle,
Workspace,
FileAsset,
)
from .. import BaseViewSet
from plane.bgtasks.issue_activities_task import issue_activity
from plane.utils.issue_filters import issue_filters
class WorkspaceDraftIssueViewSet(BaseViewSet):
model = DraftIssue
def get_queryset(self):
return (
DraftIssue.objects.filter(workspace__slug=self.kwargs.get("slug"))
.select_related("workspace", "project", "state", "parent")
.prefetch_related(
"assignees", "labels", "draft_issue_module__module"
)
.annotate(
cycle_id=Subquery(
DraftIssueCycle.objects.filter(
draft_issue=OuterRef("id"), deleted_at__isnull=True
).values("cycle_id")[:1]
)
)
.annotate(
label_ids=Coalesce(
ArrayAgg(
"labels__id",
distinct=True,
filter=Q(
~Q(labels__id__isnull=True)
& (Q(draft_label_issue__deleted_at__isnull=True))
),
),
Value([], output_field=ArrayField(UUIDField())),
),
assignee_ids=Coalesce(
ArrayAgg(
"assignees__id",
distinct=True,
filter=Q(
~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True)
& Q(draft_issue_assignee__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
module_ids=Coalesce(
ArrayAgg(
"draft_issue_module__module_id",
distinct=True,
filter=Q(
~Q(draft_issue_module__module_id__isnull=True)
& Q(
draft_issue_module__module__archived_at__isnull=True
)
& Q(draft_issue_module__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
)
).distinct()
@method_decorator(gzip_page)
@allow_permission(
allowed_roles=[ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST], level="WORKSPACE"
)
def list(self, request, slug):
filters = issue_filters(request.query_params, "GET")
issues = (
self.get_queryset()
.filter(created_by=request.user)
.order_by("-created_at")
)
issues = issues.filter(**filters)
# List Paginate
return self.paginate(
request=request,
queryset=(issues),
on_results=lambda issues: DraftIssueSerializer(
issues,
many=True,
).data,
)
@allow_permission(
allowed_roles=[ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST], level="WORKSPACE"
)
def create(self, request, slug):
workspace = Workspace.objects.get(slug=slug)
serializer = DraftIssueCreateSerializer(
data=request.data,
context={
"workspace_id": workspace.id,
"project_id": request.data.get("project_id", None),
},
)
if serializer.is_valid():
serializer.save()
issue = (
self.get_queryset()
.filter(pk=serializer.data.get("id"))
.values(
"id",
"name",
"state_id",
"sort_order",
"completed_at",
"estimate_point",
"priority",
"start_date",
"target_date",
"project_id",
"parent_id",
"cycle_id",
"module_ids",
"label_ids",
"assignee_ids",
"created_at",
"updated_at",
"created_by",
"updated_by",
"type_id",
"description_html",
)
.first()
)
return Response(issue, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
@allow_permission(
allowed_roles=[ROLE.ADMIN, ROLE.MEMBER],
creator=True,
model=Issue,
level="WORKSPACE",
)
def partial_update(self, request, slug, pk):
issue = (
self.get_queryset().filter(pk=pk, created_by=request.user).first()
)
if not issue:
return Response(
{"error": "Issue not found"},
status=status.HTTP_404_NOT_FOUND,
)
serializer = DraftIssueCreateSerializer(
issue,
data=request.data,
partial=True,
context={
"project_id": request.data.get("project_id", None),
"cycle_id": request.data.get("cycle_id", "not_provided"),
},
)
if serializer.is_valid():
serializer.save()
return Response(status=status.HTTP_204_NO_CONTENT)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
@allow_permission(
allowed_roles=[ROLE.ADMIN],
creator=True,
model=Issue,
level="WORKSPACE",
)
def retrieve(self, request, slug, pk=None):
issue = (
self.get_queryset().filter(pk=pk, created_by=request.user).first()
)
if not issue:
return Response(
{"error": "The required object does not exist."},
status=status.HTTP_404_NOT_FOUND,
)
serializer = DraftIssueDetailSerializer(issue)
return Response(serializer.data, status=status.HTTP_200_OK)
@allow_permission(
allowed_roles=[ROLE.ADMIN],
creator=True,
model=DraftIssue,
level="WORKSPACE",
)
def destroy(self, request, slug, pk=None):
draft_issue = DraftIssue.objects.get(workspace__slug=slug, pk=pk)
draft_issue.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
@allow_permission(
allowed_roles=[ROLE.ADMIN, ROLE.MEMBER],
level="WORKSPACE",
)
def create_draft_to_issue(self, request, slug, draft_id):
draft_issue = self.get_queryset().filter(pk=draft_id).first()
if not draft_issue.project_id:
return Response(
{"error": "Project is required to create an issue."},
status=status.HTTP_400_BAD_REQUEST,
)
serializer = IssueCreateSerializer(
data=request.data,
context={
"project_id": draft_issue.project_id,
"workspace_id": draft_issue.project.workspace_id,
"default_assignee_id": draft_issue.project.default_assignee_id,
},
)
if serializer.is_valid():
serializer.save()
issue_activity.delay(
type="issue.activity.created",
requested_data=json.dumps(
self.request.data, cls=DjangoJSONEncoder
),
actor_id=str(request.user.id),
issue_id=str(serializer.data.get("id", None)),
project_id=str(draft_issue.project_id),
current_instance=None,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
)
if request.data.get("cycle_id", None):
created_records = CycleIssue.objects.create(
cycle_id=request.data.get("cycle_id", None),
issue_id=serializer.data.get("id", None),
project_id=draft_issue.project_id,
workspace_id=draft_issue.workspace_id,
created_by_id=draft_issue.created_by_id,
updated_by_id=draft_issue.updated_by_id,
)
# Capture Issue Activity
issue_activity.delay(
type="cycle.activity.created",
requested_data=None,
actor_id=str(self.request.user.id),
issue_id=None,
project_id=str(self.kwargs.get("project_id", None)),
current_instance=json.dumps(
{
"updated_cycle_issues": None,
"created_cycle_issues": serializers.serialize(
"json", [created_records]
),
}
),
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
)
if request.data.get("module_ids", []):
# bulk create the module
ModuleIssue.objects.bulk_create(
[
ModuleIssue(
module_id=module,
issue_id=serializer.data.get("id", None),
workspace_id=draft_issue.workspace_id,
project_id=draft_issue.project_id,
created_by_id=draft_issue.created_by_id,
updated_by_id=draft_issue.updated_by_id,
)
for module in request.data.get("module_ids", [])
],
batch_size=10,
)
# Update the activity
_ = [
issue_activity.delay(
type="module.activity.created",
requested_data=json.dumps({"module_id": str(module)}),
actor_id=str(request.user.id),
issue_id=serializer.data.get("id", None),
project_id=draft_issue.project_id,
current_instance=None,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
)
for module in request.data.get("module_ids", [])
]
# Update file assets
file_assets = FileAsset.objects.filter(draft_issue_id=draft_id)
file_assets.update(
issue_id=serializer.data.get("id", None),
entity_type=FileAsset.EntityTypeContext.ISSUE_DESCRIPTION,
draft_issue_id=None,
)
# delete the draft issue
draft_issue.delete()
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
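`create_draft_to_issue` above follows a promote-then-cleanup sequence: validate the draft has a project, create the real issue, re-point related records (cycle, modules, file assets), then delete the draft. A simplified sketch of that ordering with plain dicts (no ORM, serializers, or activity tasks; the `"issue-1"` id is a placeholder):

```python
def promote_draft(draft, assets):
    """Promote a draft to an issue, re-point its file assets, delete the draft."""
    if draft.get("project_id") is None:
        raise ValueError("Project is required to create an issue.")
    issue = {"id": "issue-1", "name": draft["name"], "project_id": draft["project_id"]}
    # Move file assets from the draft to the newly created issue
    for asset in assets:
        if asset["draft_issue_id"] == draft["id"]:
            asset["issue_id"] = issue["id"]
            asset["draft_issue_id"] = None
    draft["deleted"] = True  # stand-in for draft_issue.delete()
    return issue
```

The ordering matters: assets are re-pointed before the draft is deleted, so no asset is ever left referencing a missing draft.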

View File

@@ -3,7 +3,11 @@ from django.db.models import (
CharField,
Count,
Q,
OuterRef,
Subquery,
IntegerField,
)
from django.db.models.functions import Coalesce
from django.db.models.functions import Cast
# Third party modules
@@ -34,6 +38,7 @@ from plane.db.models import (
User,
Workspace,
WorkspaceMember,
DraftIssue,
)
from plane.utils.cache import cache_response, invalidate_cache
@@ -74,7 +79,6 @@ class WorkSpaceMemberViewSet(BaseViewSet):
# Get all active workspace members
workspace_members = self.get_queryset()
if workspace_member.role > 5:
serializer = WorkspaceMemberAdminSerializer(
workspace_members,
@@ -284,10 +288,26 @@ class WorkspaceMemberUserViewsEndpoint(BaseAPIView):
class WorkspaceMemberUserEndpoint(BaseAPIView):
def get(self, request, slug):
-workspace_member = WorkspaceMember.objects.get(
-member=request.user,
-workspace__slug=slug,
-is_active=True,
-)
draft_issue_count = (
DraftIssue.objects.filter(
created_by=request.user,
workspace_id=OuterRef("workspace_id"),
)
.values("workspace_id")
.annotate(count=Count("id"))
.values("count")
)
workspace_member = (
WorkspaceMember.objects.filter(
member=request.user, workspace__slug=slug, is_active=True
)
.annotate(
draft_issue_count=Coalesce(
Subquery(draft_issue_count, output_field=IntegerField()), 0
)
)
.first()
)
serializer = WorkspaceMemberMeSerializer(workspace_member)
return Response(serializer.data, status=status.HTTP_200_OK)
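The `Coalesce(Subquery(draft_issue_count, ...), 0)` annotation above attaches a per-workspace draft count to the member row, defaulting to 0 when the correlated subquery returns no row. Its effect, expressed over plain data:

```python
def annotate_draft_count(member, drafts):
    """Attach this user's draft count in the member's workspace; default 0."""
    count = sum(
        1
        for draft in drafts
        if draft["created_by"] == member["user"]
        and draft["workspace_id"] == member["workspace_id"]
    )
    # Coalesce(..., 0): an empty subquery result becomes 0, never None
    return dict(member, draft_issue_count=count)

member = {"user": "u1", "workspace_id": "w1"}
drafts = [
    {"created_by": "u1", "workspace_id": "w1"},
    {"created_by": "u1", "workspace_id": "w1"},
    {"created_by": "u2", "workspace_id": "w1"},
]
```

In the ORM version, omitting `Coalesce` would surface `None` for members with no drafts, which is why the `IntegerField` subquery is wrapped.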

View File

@@ -45,6 +45,7 @@ class WorkspaceModulesEndpoint(BaseAPIView):
filter=Q(
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
),
@@ -56,6 +57,7 @@ class WorkspaceModulesEndpoint(BaseAPIView):
issue_module__issue__state__group="completed",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
@@ -67,6 +69,7 @@ class WorkspaceModulesEndpoint(BaseAPIView):
issue_module__issue__state__group="cancelled",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
@@ -78,6 +81,7 @@ class WorkspaceModulesEndpoint(BaseAPIView):
issue_module__issue__state__group="started",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
@@ -89,6 +93,7 @@ class WorkspaceModulesEndpoint(BaseAPIView):
issue_module__issue__state__group="unstarted",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
@@ -100,6 +105,7 @@ class WorkspaceModulesEndpoint(BaseAPIView):
issue_module__issue__state__group="backlog",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)

View File

@@ -14,6 +14,7 @@ from django.db.models import (
Q,
Value,
When,
Subquery,
)
from django.db.models.fields import DateField
from django.db.models.functions import Cast, ExtractWeek
@@ -40,7 +41,7 @@ from plane.db.models import (
CycleIssue,
Issue,
IssueActivity,
-IssueAttachment,
+FileAsset,
IssueLink,
IssueSubscriber,
Project,
@@ -120,7 +121,13 @@ class WorkspaceUserProfileIssuesEndpoint(BaseAPIView):
.filter(**filters)
.select_related("workspace", "project", "state", "parent")
.prefetch_related("assignees", "labels", "issue_module__module")
-.annotate(cycle_id=F("issue_cycle__cycle_id"))
+.annotate(
+cycle_id=Subquery(
+CycleIssue.objects.filter(
+issue=OuterRef("id"), deleted_at__isnull=True
+).values("cycle_id")[:1]
+)
+)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@@ -128,8 +135,9 @@ class WorkspaceUserProfileIssuesEndpoint(BaseAPIView):
.values("count")
)
.annotate(
-attachment_count=IssueAttachment.objects.filter(
-issue=OuterRef("id")
+attachment_count=FileAsset.objects.filter(
+issue_id=OuterRef("id"),
+entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -196,10 +204,10 @@ class WorkspaceUserProfileIssuesEndpoint(BaseAPIView):
group_by_field_name=group_by,
sub_group_by_field_name=sub_group_by,
count_filter=Q(
-Q(issue_inbox__status=1)
-| Q(issue_inbox__status=-1)
-| Q(issue_inbox__status=2)
-| Q(issue_inbox__isnull=True),
+Q(issue_intake__status=1)
+| Q(issue_intake__status=-1)
+| Q(issue_intake__status=2)
+| Q(issue_intake__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
@@ -223,10 +231,10 @@ class WorkspaceUserProfileIssuesEndpoint(BaseAPIView):
),
group_by_field_name=group_by,
count_filter=Q(
-Q(issue_inbox__status=1)
-| Q(issue_inbox__status=-1)
-| Q(issue_inbox__status=2)
-| Q(issue_inbox__isnull=True),
+Q(issue_intake__status=1)
+| Q(issue_intake__status=-1)
+| Q(issue_intake__status=2)
+| Q(issue_intake__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
@@ -359,8 +367,8 @@ class WorkspaceUserProfileEndpoint(BaseAPIView):
"email": user_data.email,
"first_name": user_data.first_name,
"last_name": user_data.last_name,
-"avatar": user_data.avatar,
-"cover_image": user_data.cover_image,
+"avatar_url": user_data.avatar_url,
+"cover_image_url": user_data.cover_image_url,
"date_joined": user_data.date_joined,
"user_timezone": user_data.user_timezone,
"display_name": user_data.display_name,
@@ -504,7 +512,7 @@ class WorkspaceUserProfileStatsEndpoint(BaseAPIView):
upcoming_cycles = CycleIssue.objects.filter(
workspace__slug=slug,
-cycle__start_date__gt=timezone.now().date(),
+cycle__start_date__gt=timezone.now(),
issue__assignees__in=[
user_id,
],
@@ -512,8 +520,8 @@ class WorkspaceUserProfileStatsEndpoint(BaseAPIView):
present_cycle = CycleIssue.objects.filter(
workspace__slug=slug,
-cycle__start_date__lt=timezone.now().date(),
-cycle__end_date__gt=timezone.now().date(),
+cycle__start_date__lt=timezone.now(),
+cycle__end_date__gt=timezone.now(),
issue__assignees__in=[
user_id,
],
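The switch from `timezone.now().date()` to `timezone.now()` above matters because the cycle date fields are (presumably) datetimes: comparing against a bare `date` truncates precision, so a cycle starting later the same day is misclassified. Illustrated with stdlib datetimes:

```python
from datetime import datetime, timedelta, timezone

now = datetime(2024, 11, 19, 10, 0, tzinfo=timezone.utc)
start = now + timedelta(hours=3)  # cycle starts later today

# Date-level comparison: same calendar day, so "not upcoming" (wrong)
upcoming_by_date = start.date() > now.date()
# Datetime-level comparison: correctly flags it as upcoming
upcoming_by_datetime = start > now
```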

View File

@@ -3,6 +3,7 @@ import requests
# Django imports
from django.utils import timezone
from django.db import DatabaseError, IntegrityError
# Module imports
from plane.db.models import Account
@@ -12,6 +13,7 @@ from plane.authentication.adapter.error import (
AuthenticationException,
AUTHENTICATION_ERROR_CODES,
)
from plane.utils.exception_logger import log_exception
class OauthAdapter(Adapter):
@@ -97,20 +99,48 @@ class OauthAdapter(Adapter):
self.user_data = data
def create_update_account(self, user):
-account, created = Account.objects.update_or_create(
-user=user,
-provider=self.provider,
-provider_account_id=self.user_data.get("user").get("provider_id"),
-defaults={
-"access_token": self.token_data.get("access_token"),
-"refresh_token": self.token_data.get("refresh_token", None),
-"access_token_expired_at": self.token_data.get(
-"access_token_expired_at"
-),
-"refresh_token_expired_at": self.token_data.get(
-"refresh_token_expired_at"
-),
-"last_connected_at": timezone.now(),
-"id_token": self.token_data.get("id_token", ""),
-},
-)
+try:
+# Check if the account already exists
+account = Account.objects.filter(
+user=user,
+provider=self.provider,
+provider_account_id=self.user_data.get("user").get(
+"provider_id"
+),
+).first()
+# Update the account if it exists
+if account:
+account.access_token = self.token_data.get("access_token")
+account.refresh_token = self.token_data.get(
+"refresh_token", None
+)
+account.access_token_expired_at = self.token_data.get(
+"access_token_expired_at"
+)
+account.refresh_token_expired_at = self.token_data.get(
+"refresh_token_expired_at"
+)
+account.last_connected_at = timezone.now()
+account.id_token = self.token_data.get("id_token", "")
+account.save()
# Create a new account if it does not exist
else:
Account.objects.create(
user=user,
provider=self.provider,
provider_account_id=self.user_data.get("user", {}).get(
"provider_id"
),
access_token=self.token_data.get("access_token"),
refresh_token=self.token_data.get("refresh_token", None),
access_token_expired_at=self.token_data.get(
"access_token_expired_at"
),
refresh_token_expired_at=self.token_data.get(
"refresh_token_expired_at"
),
last_connected_at=timezone.now(),
id_token=self.token_data.get("id_token", ""),
)
except (DatabaseError, IntegrityError) as e:
log_exception(e)
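The rewrite above trades `update_or_create` for an explicit filter-first branch wrapped in try/except, so database errors are logged rather than raised to the user. The same shape over an in-memory store (the dict-backed `accounts` and `log_exception` here are stand-ins):

```python
accounts = {}  # (user, provider, provider_account_id) -> token dict

def log_exception(exc):
    """Stand-in for plane.utils.exception_logger.log_exception."""
    print(f"error: {exc}")

def create_update_account(user, provider, provider_account_id, token_data):
    """Update the account if it exists, otherwise create it; log failures."""
    try:
        key = (user, provider, provider_account_id)
        account = accounts.get(key)
        if account:
            account.update(token_data)   # update existing account
        else:
            accounts[key] = dict(token_data)  # create a new account
    except Exception as exc:  # stand-in for (DatabaseError, IntegrityError)
        log_exception(exc)
```

One trade-off worth noting: unlike `update_or_create`, the filter-then-save sequence is not atomic, so concurrent logins could race; the benefit is that a failure no longer aborts the OAuth flow.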

View File

@@ -10,6 +10,9 @@ from celery import shared_task
from django.core.mail import EmailMultiAlternatives, get_connection
from django.template.loader import render_to_string
from django.utils.html import strip_tags
from django.db.models import Q, Case, Value, When
from django.db import models
from django.db.models.functions import Concat
# Module imports
from plane.db.models import Issue
@@ -84,12 +87,37 @@ def get_assignee_details(slug, filters):
"""Fetch assignee details if required."""
return (
Issue.issue_objects.filter(
workspace__slug=slug, **filters, assignees__avatar__isnull=False
Q(
Q(assignees__avatar__isnull=False)
| Q(assignees__avatar_asset__isnull=False)
),
workspace__slug=slug,
**filters,
)
.annotate(
assignees__avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
assignees__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"assignees__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
assignees__avatar_asset__isnull=True,
then="assignees__avatar",
),
default=Value(None),
output_field=models.CharField(),
)
)
.distinct("assignees__id")
.order_by("assignees__id")
.values(
"assignees__avatar",
"assignees__avatar_url",
"assignees__display_name",
"assignees__first_name",
"assignees__last_name",
@@ -102,7 +130,10 @@ def get_label_details(slug, filters):
"""Fetch label details if required"""
return (
Issue.objects.filter(
workspace__slug=slug, **filters, labels__id__isnull=False
workspace__slug=slug,
**filters,
labels__id__isnull=False,
label_issue__deleted_at__isnull=True,
)
.distinct("labels__id")
.order_by("labels__id")
@@ -128,6 +159,7 @@ def get_module_details(slug, filters):
workspace__slug=slug,
**filters,
issue_module__module_id__isnull=False,
issue_module__deleted_at__isnull=True,
)
.distinct("issue_module__module_id")
.order_by("issue_module__module_id")
@@ -144,6 +176,7 @@ def get_cycle_details(slug, filters):
workspace__slug=slug,
**filters,
issue_cycle__cycle_id__isnull=False,
issue_cycle__deleted_at__isnull=True,
)
.distinct("issue_cycle__cycle_id")
.order_by("issue_cycle__cycle_id")
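The `Case`/`When` annotation added to `get_assignee_details` above prefers an uploaded `avatar_asset` (served from `/api/assets/v2/static/<id>/`) and falls back to the legacy `avatar` URL field. As a plain function:

```python
def resolve_avatar_url(avatar, avatar_asset_id):
    """Prefer the uploaded asset URL; fall back to the legacy avatar field."""
    if avatar_asset_id is not None:
        # Mirrors Concat(Value("/api/assets/v2/static/"), asset_id, Value("/"))
        return f"/api/assets/v2/static/{avatar_asset_id}/"
    return avatar
```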

View File

@@ -2,6 +2,7 @@
from django.utils import timezone
from django.apps import apps
from django.conf import settings
from django.db import models
from django.core.exceptions import ObjectDoesNotExist
# Third party imports
@@ -18,17 +19,25 @@ def soft_delete_related_objects(
for field in related_fields:
if field.one_to_many or field.one_to_one:
try:
if field.one_to_many:
related_objects = getattr(instance, field.name).all()
elif field.one_to_one:
related_object = getattr(instance, field.name)
related_objects = (
[related_object] if related_object is not None else []
)
for obj in related_objects:
if obj:
obj.deleted_at = timezone.now()
obj.save(using=using)
# Check if the field has CASCADE on delete
if (
not hasattr(field.remote_field, "on_delete")
or field.remote_field.on_delete == models.CASCADE
):
if field.one_to_many:
related_objects = getattr(instance, field.name).all()
elif field.one_to_one:
related_object = getattr(instance, field.name)
related_objects = (
[related_object]
if related_object is not None
else []
)
for obj in related_objects:
if obj:
obj.deleted_at = timezone.now()
obj.save(using=using)
except ObjectDoesNotExist:
pass
@@ -154,8 +163,7 @@ def hard_delete():
if hasattr(model, "deleted_at"):
# Get all instances where 'deleted_at' is greater than 30 days ago
_ = model.all_objects.filter(
deleted_at__lt=timezone.now()
- timezone.timedelta(days=days)
deleted_at__lt=timezone.now() - timezone.timedelta(days=days)
).delete()
return
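The guard added to `soft_delete_related_objects` above stops soft deletion from cascading through relations that would not cascade on a hard delete (e.g. `SET_NULL` relations). A sketch of the rule, with records as dicts and the `on_delete` behaviour stored per relation (the real code inspects `field.remote_field.on_delete`):

```python
from datetime import datetime, timezone

CASCADE, SET_NULL = "CASCADE", "SET_NULL"

def soft_delete(instance, relations):
    """Soft-delete the instance; cascade only to CASCADE-related objects."""
    instance["deleted_at"] = datetime.now(timezone.utc)
    for on_delete, related_objects in relations:
        if on_delete != CASCADE:
            continue  # SET_NULL / PROTECT relations survive the delete
        for obj in related_objects:
            obj["deleted_at"] = datetime.now(timezone.utc)
```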

View File

@@ -30,8 +30,8 @@ from plane.db.models import (
Page,
ProjectPage,
PageLabel,
-Inbox,
-InboxIssue,
+Intake,
+IntakeIssue,
)
@@ -47,7 +47,7 @@ def create_project(workspace, user_id):
: random.randint(2, 12 if len(name) - 1 >= 12 else len(name) - 1)
].upper(),
created_by_id=user_id,
-inbox_view=True,
+intake_view=True,
)
# Add current member as project member
@@ -406,18 +406,18 @@ def create_issues(workspace, project, user_id, issue_count):
return issues
-def create_inbox_issues(workspace, project, user_id, inbox_issue_count):
-issues = create_issues(workspace, project, user_id, inbox_issue_count)
-inbox, create = Inbox.objects.get_or_create(
-name="Inbox",
+def create_intake_issues(workspace, project, user_id, intake_issue_count):
+issues = create_issues(workspace, project, user_id, intake_issue_count)
+intake, create = Intake.objects.get_or_create(
+name="Intake",
project=project,
is_default=True,
)
-InboxIssue.objects.bulk_create(
+IntakeIssue.objects.bulk_create(
[
-InboxIssue(
+IntakeIssue(
issue=issue,
-inbox=inbox,
+intake=intake,
status=(status := [-2, -1, 0, 1, 2][random.randint(0, 4)]),
snoozed_till=(
datetime.now() + timedelta(days=random.randint(1, 30))
@@ -599,7 +599,7 @@ def create_dummy_data(
cycle_count,
module_count,
pages_count,
-inbox_issue_count,
+intake_issue_count,
):
workspace = Workspace.objects.get(slug=slug)
@@ -660,12 +660,12 @@ def create_dummy_data(
issue_count=issue_count,
)
-# create inbox issues
-create_inbox_issues(
+# create intake issues
+create_intake_issues(
workspace=workspace,
project=project,
user_id=user_id,
-inbox_issue_count=inbox_issue_count,
+intake_issue_count=intake_issue_count,
)
# create issue parent

View File

@@ -224,7 +224,7 @@ def send_email_notification(
{
"actor_comments": comment,
"actor_detail": {
-"avatar_url": actor.avatar,
+"avatar_url": f"{base_api}{actor.avatar_url}",
"first_name": actor.first_name,
"last_name": actor.last_name,
},
@@ -241,7 +241,7 @@ def send_email_notification(
{
"actor_comments": mention,
"actor_detail": {
-"avatar_url": actor.avatar,
+"avatar_url": f"{base_api}{actor.avatar_url}",
"first_name": actor.first_name,
"last_name": actor.last_name,
},
@@ -257,7 +257,7 @@ def send_email_notification(
template_data.append(
{
"actor_detail": {
-"avatar_url": actor.avatar,
+"avatar_url": f"{base_api}{actor.avatar_url}",
"first_name": actor.first_name,
"last_name": actor.last_name,
},

View File

@@ -105,7 +105,6 @@ def upload_to_s3(zip_file, workspace_id, token_id, slug):
ExpiresIn=expires_in,
)
else:
# If endpoint url is present, use it
if settings.AWS_S3_ENDPOINT_URL:
s3 = boto3.client(
@@ -129,7 +128,7 @@ def upload_to_s3(zip_file, workspace_id, token_id, slug):
zip_file,
settings.AWS_STORAGE_BUCKET_NAME,
file_name,
-ExtraArgs={"ACL": "public-read", "ContentType": "application/zip"},
+ExtraArgs={"ContentType": "application/zip"},
)
# Generate presigned url for the uploaded file

View File

@@ -1,4 +1,5 @@
# Python imports
import os
from datetime import timedelta
# Django imports
@@ -13,16 +14,14 @@ from plane.db.models import FileAsset
@shared_task
-def delete_file_asset():
-# file assets to delete
-file_assets_to_delete = FileAsset.objects.filter(
-Q(is_deleted=True)
-& Q(updated_at__lte=timezone.now() - timedelta(days=7))
-)
-# Delete the file from storage and the file object from the database
-for file_asset in file_assets_to_delete:
-# Delete the file from storage
-file_asset.asset.delete(save=False)
-# Delete the file object
-file_asset.delete()
+def delete_unuploaded_file_asset():
+"""This task deletes unuploaded file assets older than a certain number of days."""
+FileAsset.objects.filter(
+Q(
+created_at__lt=timezone.now()
+- timedelta(
+days=int(os.environ.get("UNUPLOADED_ASSET_DELETE_DAYS", "7"))
+)
+)
+& Q(is_uploaded=False)
+).delete()
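The replacement task above selects assets that were never uploaded and are older than a configurable cutoff (default 7 days via `UNUPLOADED_ASSET_DELETE_DAYS`). The selection logic on its own:

```python
from datetime import datetime, timedelta, timezone

def stale_unuploaded(assets, now, days=7):
    """Return assets that were never uploaded and are older than the cutoff."""
    cutoff = now - timedelta(days=days)
    return [a for a in assets if not a["is_uploaded"] and a["created_at"] < cutoff]

now = datetime(2024, 11, 19, tzinfo=timezone.utc)
assets = [
    {"is_uploaded": False, "created_at": now - timedelta(days=10)},  # stale
    {"is_uploaded": False, "created_at": now - timedelta(days=1)},   # too recent
    {"is_uploaded": True, "created_at": now - timedelta(days=10)},   # uploaded, keep
]
```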

View File

@@ -31,6 +31,7 @@ from plane.db.models import (
from plane.settings.redis import redis_instance
from plane.utils.exception_logger import log_exception
from plane.bgtasks.webhook_task import webhook_activity
from plane.utils.issue_relation_mapper import get_inverse_relation
# Track Changes in name
@@ -465,7 +466,7 @@ def track_estimate_points(
IssueActivity(
issue_id=issue_id,
actor_id=actor_id,
-verb="updated",
+verb="removed" if new_estimate is None else "updated",
old_identifier=(
current_instance.get("estimate_point")
if current_instance.get("estimate_point") is not None
@@ -1394,6 +1395,9 @@ def create_issue_relation_activity(
epoch=epoch,
)
)
inverse_relation = get_inverse_relation(
requested_data.get("relation_type")
)
issue = Issue.objects.get(pk=issue_id)
issue_activities.append(
IssueActivity(
@@ -1402,19 +1406,10 @@ def create_issue_relation_activity(
verb="updated",
old_value="",
new_value=f"{issue.project.identifier}-{issue.sequence_id}",
-field=(
-"blocking"
-if requested_data.get("relation_type") == "blocked_by"
-else (
-"blocked_by"
-if requested_data.get("relation_type")
-== "blocking"
-else requested_data.get("relation_type")
-)
-),
+field=inverse_relation,
project_id=project_id,
workspace_id=workspace_id,
-comment=f'added {"blocking" if requested_data.get("relation_type") == "blocked_by" else ("blocked_by" if requested_data.get("relation_type") == "blocking" else requested_data.get("relation_type")),} relation',
+comment=f"added {inverse_relation} relation",
old_identifier=issue_id,
epoch=epoch,
)
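The nested conditionals removed above were collapsed into a `get_inverse_relation` helper imported from `plane.utils.issue_relation_mapper`. A plausible sketch of such a mapping (the exact set of relation names beyond `blocked_by`/`blocking` is an assumption):

```python
INVERSE_RELATION = {
    "blocked_by": "blocking",
    "blocking": "blocked_by",
}

def get_inverse_relation(relation_type):
    """Return the relation as seen from the other issue.

    Symmetric relation types (e.g. "duplicate") map to themselves.
    """
    return INVERSE_RELATION.get(relation_type, relation_type)
```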
@@ -1572,7 +1567,7 @@ def delete_draft_issue_activity(
)
-def create_inbox_activity(
+def create_intake_activity(
requested_data,
current_instance,
issue_id,
@@ -1601,8 +1596,8 @@ def create_inbox_activity(
issue_id=issue_id,
project_id=project_id,
workspace_id=workspace_id,
-comment="updated the inbox status",
-field="inbox",
+comment="updated the intake status",
+field="intake",
verb=requested_data.get("status"),
actor_id=actor_id,
epoch=epoch,
@@ -1625,7 +1620,7 @@ def issue_activity(
subscriber=True,
notification=False,
origin=None,
-inbox=None,
+intake=None,
):
try:
issue_activities = []
@@ -1673,7 +1668,7 @@ def issue_activity(
"issue_draft.activity.created": create_draft_issue_activity,
"issue_draft.activity.updated": update_draft_issue_activity,
"issue_draft.activity.deleted": delete_draft_issue_activity,
-"inbox.activity.created": create_inbox_activity,
+"intake.activity.created": create_intake_activity,
}
func = ACTIVITY_MAPPER.get(type)
@@ -1700,16 +1695,12 @@ def issue_activity(
event=(
"issue_comment"
if activity.field == "comment"
-else "inbox_issue"
-if inbox
-else "issue"
+else "intake_issue" if intake else "issue"
),
event_id=(
activity.issue_comment_id
if activity.field == "comment"
-else inbox
-if inbox
-else activity.issue_id
+else intake if intake else activity.issue_id
),
verb=activity.verb,
field=(

View File

@@ -42,21 +42,19 @@ def archive_old_issues():
),
Q(issue_cycle__isnull=True)
| (
-Q(issue_cycle__cycle__end_date__lt=timezone.now().date())
+Q(issue_cycle__cycle__end_date__lt=timezone.now())
& Q(issue_cycle__isnull=False)
),
Q(issue_module__isnull=True)
| (
-Q(
-issue_module__module__target_date__lt=timezone.now().date()
-)
+Q(issue_module__module__target_date__lt=timezone.now())
& Q(issue_module__isnull=False)
),
).filter(
-Q(issue_inbox__status=1)
-| Q(issue_inbox__status=-1)
-| Q(issue_inbox__status=2)
-| Q(issue_inbox__isnull=True)
+Q(issue_intake__status=1)
+| Q(issue_intake__status=-1)
+| Q(issue_intake__status=2)
+| Q(issue_intake__isnull=True)
)
# Check if Issues
@@ -122,21 +120,19 @@ def close_old_issues():
),
Q(issue_cycle__isnull=True)
| (
-Q(issue_cycle__cycle__end_date__lt=timezone.now().date())
+Q(issue_cycle__cycle__end_date__lt=timezone.now())
& Q(issue_cycle__isnull=False)
),
Q(issue_module__isnull=True)
| (
-Q(
-issue_module__module__target_date__lt=timezone.now().date()
-)
+Q(issue_module__module__target_date__lt=timezone.now())
& Q(issue_module__isnull=False)
),
).filter(
-Q(issue_inbox__status=1)
-| Q(issue_inbox__status=-1)
-| Q(issue_inbox__status=2)
-| Q(issue_inbox__isnull=True)
+Q(issue_intake__status=1)
+| Q(issue_intake__status=-1)
+| Q(issue_intake__status=2)
+| Q(issue_intake__isnull=True)
)
# Check if Issues

View File

@@ -0,0 +1,28 @@
# Third party imports
from celery import shared_task
# Module imports
from plane.db.models import FileAsset
from plane.settings.storage import S3Storage
from plane.utils.exception_logger import log_exception
@shared_task
def get_asset_object_metadata(asset_id):
try:
# Get the asset
asset = FileAsset.objects.get(pk=asset_id)
# Create an instance of the S3 storage
storage = S3Storage()
# Get the storage
asset.storage_metadata = storage.get_object_metadata(
object_name=asset.asset.name
)
# Save the asset
asset.save(update_fields=["storage_metadata"])
return
except FileAsset.DoesNotExist:
return
except Exception as e:
log_exception(e)
return

View File

@@ -27,7 +27,7 @@ from plane.api.serializers import (
ModuleSerializer,
ProjectSerializer,
UserLiteSerializer,
-InboxIssueSerializer,
+IntakeIssueSerializer,
)
from plane.db.models import (
Cycle,
@@ -40,7 +40,7 @@ from plane.db.models import (
User,
Webhook,
WebhookLog,
-InboxIssue,
+IntakeIssue,
)
from plane.license.utils.instance_value import get_email_configuration
from plane.utils.exception_logger import log_exception
@@ -54,7 +54,7 @@ SERIALIZER_MAPPER = {
"module_issue": ModuleIssueSerializer,
"issue_comment": IssueCommentSerializer,
"user": UserLiteSerializer,
-"inbox_issue": InboxIssueSerializer,
+"intake_issue": IntakeIssueSerializer,
}
MODEL_MAPPER = {
@@ -66,7 +66,7 @@ MODEL_MAPPER = {
"module_issue": ModuleIssue,
"issue_comment": IssueComment,
"user": User,
-"inbox_issue": InboxIssue,
+"intake_issue": IntakeIssue,
}

View File

@@ -25,20 +25,24 @@ app.conf.beat_schedule = {
"schedule": crontab(hour=0, minute=0),
},
"check-every-day-to-delete-file-asset": {
-"task": "plane.bgtasks.file_asset_task.delete_file_asset",
+"task": "plane.bgtasks.file_asset_task.delete_unuploaded_file_asset",
"schedule": crontab(hour=0, minute=0),
},
"check-every-five-minutes-to-send-email-notifications": {
"task": "plane.bgtasks.email_notification_task.stack_email_notification",
"schedule": crontab(minute="*/5"),
},
"check-every-day-to-delete-hard-delete": {
"task": "plane.bgtasks.deletion_task.hard_delete",
"schedule": crontab(hour=0, minute=0),
},
"check-every-day-to-delete-api-logs": {
"task": "plane.bgtasks.api_logs_task.delete_api_logs",
"schedule": crontab(hour=0, minute=0),
},
-"check-every-day-to-delete-hard-delete": {
-    "task": "plane.bgtasks.deletion_task.hard_delete",
-    "schedule": crontab(hour=0, minute=0),
+"run-every-6-hours-for-instance-trace": {
+    "task": "plane.license.bgtasks.tracer.instance_traces",
+    "schedule": crontab(hour="*/6"),
},
}
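The new `run-every-6-hours-for-instance-trace` entry relies on Celery's cron step syntax: `crontab(hour="*/6")` fires at hours 0, 6, 12, and 18. A minimal sketch of how such a step expression expands (`expand_cron_field` is an illustrative helper, not part of Celery):

```python
def expand_cron_field(expr, max_value):
    """Expand a cron field such as "*/6", "*", or "0" into the
    sorted list of matching values in [0, max_value)."""
    values = set()
    for part in expr.split(","):
        if part.startswith("*/"):
            # step expression: every N units starting from 0
            values.update(range(0, max_value, int(part[2:])))
        elif part == "*":
            values.update(range(max_value))
        else:
            values.add(int(part))
    return sorted(values)

hours = expand_cron_field("*/6", 24)  # → [0, 6, 12, 18]
minutes = expand_cron_field("0", 60)  # → [0]
```

So `crontab(hour="*/6")` runs four times a day, while the `crontab(hour=0, minute=0)` entries above run once at midnight.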

View File

@@ -1,67 +1,45 @@
# Python imports
import os
import boto3
import json
from botocore.exceptions import ClientError
# Django imports
from django.core.management import BaseCommand
from django.conf import settings
class Command(BaseCommand):
help = "Create the default bucket for the instance"
def set_bucket_public_policy(self, s3_client, bucket_name):
public_policy = {
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": "*",
"Action": ["s3:GetObject"],
"Resource": [f"arn:aws:s3:::{bucket_name}/*"],
}
],
}
try:
s3_client.put_bucket_policy(
Bucket=bucket_name, Policy=json.dumps(public_policy)
)
self.stdout.write(
self.style.SUCCESS(
f"Public read access policy set for bucket '{bucket_name}'."
)
)
except ClientError as e:
self.stdout.write(
self.style.ERROR(
f"Error setting public read access policy: {e}"
)
)
def handle(self, *args, **options):
# Create a session using the credentials from Django settings
try:
-session = boto3.session.Session(
-    aws_access_key_id=settings.AWS_ACCESS_KEY_ID,
-    aws_secret_access_key=settings.AWS_SECRET_ACCESS_KEY,
+s3_client = boto3.client(
+    "s3",
+    endpoint_url=os.environ.get(
+        "AWS_S3_ENDPOINT_URL"
+    ),  # MinIO endpoint
+    aws_access_key_id=os.environ.get(
+        "AWS_ACCESS_KEY_ID"
+    ),  # MinIO access key
+    aws_secret_access_key=os.environ.get(
+        "AWS_SECRET_ACCESS_KEY"
+    ),  # MinIO secret key
+    region_name=os.environ.get("AWS_REGION"),  # MinIO region
+    config=boto3.session.Config(signature_version="s3v4"),
+)
-# Create an S3 client using the session
-s3_client = session.client(
-    "s3", endpoint_url=settings.AWS_S3_ENDPOINT_URL
-)
-bucket_name = settings.AWS_STORAGE_BUCKET_NAME
+# Get the bucket name from the environment
+bucket_name = os.environ.get("AWS_S3_BUCKET_NAME")
self.stdout.write(self.style.NOTICE("Checking bucket..."))
# Check if the bucket exists
s3_client.head_bucket(Bucket=bucket_name)
self.set_bucket_public_policy(s3_client, bucket_name)
# If the bucket exists, print a success message
self.stdout.write(
self.style.SUCCESS(f"Bucket '{bucket_name}' exists.")
)
return
except ClientError as e:
error_code = int(e.response["Error"]["Code"])
-bucket_name = settings.AWS_STORAGE_BUCKET_NAME
+bucket_name = os.environ.get("AWS_S3_BUCKET_NAME")
if error_code == 404:
# Bucket does not exist, create it
self.stdout.write(
@@ -76,13 +54,16 @@ class Command(BaseCommand):
f"Bucket '{bucket_name}' created successfully."
)
)
self.set_bucket_public_policy(s3_client, bucket_name)
# Handle the exception if the bucket creation fails
except ClientError as create_error:
self.stdout.write(
self.style.ERROR(
f"Failed to create bucket: {create_error}"
)
)
# Handle the exception if access to the bucket is forbidden
elif error_code == 403:
# Access to the bucket is forbidden
self.stdout.write(

View File

@@ -62,13 +62,15 @@ class Command(BaseCommand):
project_count = int(input("Number of projects to be created: "))
for i in range(project_count):
-print(f"Please provide the following details for project {i+1}:")
+print(
+    f"Please provide the following details for project {i+1}:"
+)
issue_count = int(input("Number of issues to be created: "))
cycle_count = int(input("Number of cycles to be created: "))
module_count = int(input("Number of modules to be created: "))
pages_count = int(input("Number of pages to be created: "))
-inbox_issue_count = int(
-    input("Number of inbox issues to be created: ")
+intake_issue_count = int(
+    input("Number of intake issues to be created: ")
)
from plane.bgtasks.dummy_data_task import create_dummy_data
@@ -81,7 +83,7 @@ class Command(BaseCommand):
cycle_count=cycle_count,
module_count=module_count,
pages_count=pages_count,
-inbox_issue_count=inbox_issue_count,
+intake_issue_count=intake_issue_count,
)
self.stdout.write(

View File

@@ -0,0 +1,223 @@
# Python imports
import os
import boto3
from botocore.exceptions import ClientError
import json
# Django imports
from django.core.management import BaseCommand
class Command(BaseCommand):
help = "Create the default bucket for the instance"
def get_s3_client(self):
s3_client = boto3.client(
"s3",
endpoint_url=os.environ.get(
"AWS_S3_ENDPOINT_URL"
), # MinIO endpoint
aws_access_key_id=os.environ.get(
"AWS_ACCESS_KEY_ID"
), # MinIO access key
aws_secret_access_key=os.environ.get(
"AWS_SECRET_ACCESS_KEY"
), # MinIO secret key
region_name=os.environ.get("AWS_REGION"), # MinIO region
config=boto3.session.Config(signature_version="s3v4"),
)
return s3_client
# Check if the access key has the required permissions
def check_s3_permissions(self, bucket_name):
s3_client = self.get_s3_client()
permissions = {
"s3:GetObject": False,
"s3:ListBucket": False,
"s3:PutBucketPolicy": False,
"s3:PutObject": False,
}
# 1. Test s3:ListBucket (attempt to list the bucket contents)
try:
s3_client.list_objects_v2(Bucket=bucket_name)
permissions["s3:ListBucket"] = True
except ClientError as e:
if e.response["Error"]["Code"] == "AccessDenied":
self.stdout.write("ListBucket permission denied.")
else:
self.stdout.write(f"Error in ListBucket: {e}")
# 2. Test s3:GetObject (attempt to get a specific object)
try:
response = s3_client.list_objects_v2(Bucket=bucket_name)
if "Contents" in response:
test_object_key = response["Contents"][0]["Key"]
s3_client.get_object(Bucket=bucket_name, Key=test_object_key)
permissions["s3:GetObject"] = True
except ClientError as e:
if e.response["Error"]["Code"] == "AccessDenied":
self.stdout.write("GetObject permission denied.")
else:
self.stdout.write(f"Error in GetObject: {e}")
# 3. Test s3:PutObject (attempt to upload an object)
try:
s3_client.put_object(
Bucket=bucket_name,
Key="test_permission_check.txt",
Body=b"Test",
)
permissions["s3:PutObject"] = True
# Clean up
except ClientError as e:
if e.response["Error"]["Code"] == "AccessDenied":
self.stdout.write("PutObject permission denied.")
else:
self.stdout.write(f"Error in PutObject: {e}")
# Clean up
try:
s3_client.delete_object(
Bucket=bucket_name, Key="test_permission_check.txt"
)
except ClientError:
self.stdout.write("Couldn't delete test object")
# 4. Test s3:PutBucketPolicy (attempt to put a bucket policy)
try:
policy = {
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": "*",
"Action": "s3:GetObject",
"Resource": f"arn:aws:s3:::{bucket_name}/*",
}
],
}
s3_client.put_bucket_policy(
Bucket=bucket_name, Policy=json.dumps(policy)
)
permissions["s3:PutBucketPolicy"] = True
except ClientError as e:
if e.response["Error"]["Code"] == "AccessDenied":
self.stdout.write("PutBucketPolicy permission denied.")
else:
self.stdout.write(f"Error in PutBucketPolicy: {e}")
return permissions
def generate_bucket_policy(self, bucket_name):
s3_client = self.get_s3_client()
response = s3_client.list_objects_v2(Bucket=bucket_name)
public_object_resource = []
if "Contents" in response:
for obj in response["Contents"]:
object_key = obj["Key"]
public_object_resource.append(
f"arn:aws:s3:::{bucket_name}/{object_key}"
)
bucket_policy = {
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": "*",
"Action": "s3:GetObject",
"Resource": public_object_resource,
}
],
}
return bucket_policy
def make_objects_public(self, bucket_name):
# Initialize S3 client
s3_client = self.get_s3_client()
# Get the bucket policy
bucket_policy = self.generate_bucket_policy(bucket_name)
# Apply the policy to the bucket
s3_client.put_bucket_policy(
Bucket=bucket_name, Policy=json.dumps(bucket_policy)
)
# Print a success message
self.stdout.write(
"Bucket is private, but existing objects remain public."
)
return
def handle(self, *args, **options):
# Create a session using the credentials from Django settings
# Check if the bucket exists
s3_client = self.get_s3_client()
# Get the bucket name from the environment
bucket_name = os.environ.get("AWS_S3_BUCKET_NAME")
if not bucket_name:
self.stdout.write(
self.style.ERROR(
"Please set the AWS_S3_BUCKET_NAME environment variable."
)
)
return
self.stdout.write(self.style.NOTICE("Checking bucket..."))
# Check if the bucket exists
try:
s3_client.head_bucket(Bucket=bucket_name)
except ClientError as e:
error_code = e.response["Error"]["Code"]
if error_code == "404":
self.stdout.write(
self.style.ERROR(f"Bucket '{bucket_name}' does not exist.")
)
return
else:
self.stdout.write(f"Error: {e}")
# If the bucket exists, print a success message
self.stdout.write(
self.style.SUCCESS(f"Bucket '{bucket_name}' exists.")
)
try:
# Check the permissions of the access key
permissions = self.check_s3_permissions(bucket_name)
except ClientError as e:
self.stdout.write(f"Error: {e}")
except Exception as e:
self.stdout.write(f"Error: {e}")
# If the access key has the required permissions
try:
if all(permissions.values()):
self.stdout.write(
self.style.SUCCESS(
"Access key has the required permissions."
)
)
# Making the existing objects public
self.make_objects_public(bucket_name)
return
except Exception as e:
self.stdout.write(f"Error: {e}")
# write the bucket policy to a file
self.stdout.write(
self.style.WARNING(
"Generating permissions.json for manual bucket policy update."
)
)
try:
# Writing to a file
with open("permissions.json", "w") as f:
f.write(json.dumps(self.generate_bucket_policy(bucket_name)))
self.stdout.write(
self.style.WARNING(
"Permissions have been written to permissions.json."
)
)
return
except IOError as e:
self.stdout.write(f"Error writing permissions.json: {e}")
return
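The policy document assembled by `generate_bucket_policy` above is plain JSON: one `s3:GetObject` statement whose `Resource` lists an ARN per existing object, so new uploads stay private while existing objects remain public. A dependency-free sketch of that document shape (the bucket name and object keys here are made-up examples):

```python
def build_public_read_policy(bucket_name, object_keys):
    """Build a public-read bucket policy limited to the given objects,
    mirroring the document generate_bucket_policy produces."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": "*",
                "Action": "s3:GetObject",
                # One ARN per object instead of a bucket-wide "/*" wildcard
                "Resource": [
                    f"arn:aws:s3:::{bucket_name}/{key}" for key in object_keys
                ],
            }
        ],
    }

policy = build_public_read_policy("plane-app", ["logo.png", "cover.jpg"])
```

In the command above this document is either applied via `put_bucket_policy` (when the key has `s3:PutBucketPolicy`) or written to `permissions.json` for a manual update.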

View File

@@ -3,7 +3,6 @@
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
-from plane.db.models import IssueRelation
from sentry_sdk import capture_exception
import uuid
@@ -11,6 +10,7 @@ import uuid
def create_issue_relation(apps, schema_editor):
try:
IssueBlockerModel = apps.get_model("db", "IssueBlocker")
+IssueRelation = apps.get_model("db", "IssueRelation")
updated_issue_relation = []
for blocked_issue in IssueBlockerModel.objects.all():
updated_issue_relation.append(

View File

@@ -1,11 +1,10 @@
# Generated by Django 4.2.7 on 2024-01-02 13:15
-from plane.db.models import WorkspaceUserProperties, ProjectMember, IssueView
from django.db import migrations
def workspace_user_properties(apps, schema_editor):
WorkspaceMember = apps.get_model("db", "WorkspaceMember")
+WorkspaceUserProperties = apps.get_model("db", "WorkspaceUserProperties")
updated_workspace_user_properties = []
for workspace_members in WorkspaceMember.objects.all():
updated_workspace_user_properties.append(
@@ -21,12 +20,14 @@ def workspace_user_properties(apps, schema_editor):
)
)
WorkspaceUserProperties.objects.bulk_create(
-updated_workspace_user_properties, batch_size=2000
+updated_workspace_user_properties,
+batch_size=2000,
)
def project_user_properties(apps, schema_editor):
IssueProperty = apps.get_model("db", "IssueProperty")
ProjectMember = apps.get_model("db", "ProjectMember")
updated_issue_user_properties = []
for issue_property in IssueProperty.objects.all():
project_member = ProjectMember.objects.filter(
@@ -49,6 +50,7 @@ def project_user_properties(apps, schema_editor):
def issue_view(apps, schema_editor):
GlobalView = apps.get_model("db", "GlobalView")
+IssueView = apps.get_model("db", "IssueView")
updated_issue_views = []
for global_view in GlobalView.objects.all():

View File

@@ -0,0 +1,179 @@
# Generated by Django 4.2.15 on 2024-10-09 06:19
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import plane.db.models.asset
class Migration(migrations.Migration):
dependencies = [
(
"db",
"0077_draftissue_cycle_user_timezone_project_user_timezone_and_more",
),
]
operations = [
migrations.AddField(
model_name="fileasset",
name="comment",
field=models.ForeignKey(
null=True,
on_delete=django.db.models.deletion.CASCADE,
related_name="assets",
to="db.issuecomment",
),
),
migrations.AddField(
model_name="fileasset",
name="entity_type",
field=models.CharField(
blank=True,
choices=[
("ISSUE_ATTACHMENT", "Issue Attachment"),
("ISSUE_DESCRIPTION", "Issue Description"),
("COMMENT_DESCRIPTION", "Comment Description"),
("PAGE_DESCRIPTION", "Page Description"),
("USER_COVER", "User Cover"),
("USER_AVATAR", "User Avatar"),
("WORKSPACE_LOGO", "Workspace Logo"),
("PROJECT_COVER", "Project Cover"),
],
max_length=255,
null=True,
),
),
migrations.AddField(
model_name="fileasset",
name="external_id",
field=models.CharField(blank=True, max_length=255, null=True),
),
migrations.AddField(
model_name="fileasset",
name="external_source",
field=models.CharField(blank=True, max_length=255, null=True),
),
migrations.AddField(
model_name="fileasset",
name="is_uploaded",
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name="fileasset",
name="issue",
field=models.ForeignKey(
null=True,
on_delete=django.db.models.deletion.CASCADE,
related_name="assets",
to="db.issue",
),
),
migrations.AddField(
model_name="fileasset",
name="page",
field=models.ForeignKey(
null=True,
on_delete=django.db.models.deletion.CASCADE,
related_name="assets",
to="db.page",
),
),
migrations.AddField(
model_name="fileasset",
name="project",
field=models.ForeignKey(
null=True,
on_delete=django.db.models.deletion.CASCADE,
related_name="assets",
to="db.project",
),
),
migrations.AddField(
model_name="fileasset",
name="size",
field=models.FloatField(default=0),
),
migrations.AddField(
model_name="fileasset",
name="storage_metadata",
field=models.JSONField(blank=True, default=dict, null=True),
),
migrations.AddField(
model_name="fileasset",
name="user",
field=models.ForeignKey(
null=True,
on_delete=django.db.models.deletion.CASCADE,
related_name="assets",
to=settings.AUTH_USER_MODEL,
),
),
migrations.AddField(
model_name="project",
name="cover_image_asset",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="project_cover_image",
to="db.fileasset",
),
),
migrations.AddField(
model_name="user",
name="avatar_asset",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="user_avatar",
to="db.fileasset",
),
),
migrations.AddField(
model_name="user",
name="cover_image_asset",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="user_cover_image",
to="db.fileasset",
),
),
migrations.AddField(
model_name="workspace",
name="logo_asset",
field=models.ForeignKey(
blank=True,
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="workspace_logo",
to="db.fileasset",
),
),
migrations.AlterField(
model_name="fileasset",
name="asset",
field=models.FileField(
max_length=800, upload_to=plane.db.models.asset.get_upload_path
),
),
migrations.AlterField(
model_name="integration",
name="avatar_url",
field=models.TextField(blank=True, null=True),
),
migrations.AlterField(
model_name="project",
name="cover_image",
field=models.TextField(blank=True, null=True),
),
migrations.AlterField(
model_name="workspace",
name="logo",
field=models.TextField(blank=True, null=True, verbose_name="Logo"),
),
]

View File

@@ -0,0 +1,64 @@
# Generated by Django 4.2.15 on 2024-10-09 06:19
from django.db import migrations
def move_attachment_to_fileasset(apps, schema_editor):
FileAsset = apps.get_model("db", "FileAsset")
IssueAttachment = apps.get_model("db", "IssueAttachment")
bulk_issue_attachment = []
for issue_attachment in IssueAttachment.objects.values(
"issue_id",
"project_id",
"workspace_id",
"asset",
"attributes",
"external_source",
"external_id",
"deleted_at",
"created_by_id",
"updated_by_id",
):
bulk_issue_attachment.append(
FileAsset(
issue_id=issue_attachment["issue_id"],
entity_type="ISSUE_ATTACHMENT",
project_id=issue_attachment["project_id"],
workspace_id=issue_attachment["workspace_id"],
attributes=issue_attachment["attributes"],
asset=issue_attachment["asset"],
external_source=issue_attachment["external_source"],
external_id=issue_attachment["external_id"],
deleted_at=issue_attachment["deleted_at"],
created_by_id=issue_attachment["created_by_id"],
updated_by_id=issue_attachment["updated_by_id"],
size=issue_attachment["attributes"].get("size", 0),
)
)
FileAsset.objects.bulk_create(bulk_issue_attachment, batch_size=1000)
def mark_existing_file_uploads(apps, schema_editor):
FileAsset = apps.get_model("db", "FileAsset")
# Mark all existing file uploads as uploaded
FileAsset.objects.update(is_uploaded=True)
class Migration(migrations.Migration):
dependencies = [
("db", "0078_fileasset_comment_fileasset_entity_type_and_more"),
]
operations = [
migrations.RunPython(
move_attachment_to_fileasset,
reverse_code=migrations.RunPython.noop,
),
migrations.RunPython(
mark_existing_file_uploads,
reverse_code=migrations.RunPython.noop,
),
]
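The data migration above copies every `IssueAttachment` row into `FileAsset` via `bulk_create(..., batch_size=1000)`, which issues one INSERT per 1000 objects rather than one per row. A minimal sketch of that batching idea (`chunked` is an illustrative helper, not Django API):

```python
def chunked(iterable, size):
    """Yield lists of at most `size` items — the same batching idea
    behind bulk_create's batch_size argument."""
    batch = []
    for item in iterable:
        batch.append(item)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        # Flush the final partial batch
        yield batch

batches = list(chunked(range(5), 2))  # → [[0, 1], [2, 3], [4]]
```

Both `RunPython` operations use `reverse_code=migrations.RunPython.noop`, so the migration can be unapplied without trying to un-copy the data.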

View File

@@ -0,0 +1,45 @@
# Generated by Django 4.2.15 on 2024-10-12 18:45
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
("db", "0079_auto_20241009_0619"),
]
operations = [
migrations.AddField(
model_name="fileasset",
name="draft_issue",
field=models.ForeignKey(
null=True,
on_delete=django.db.models.deletion.CASCADE,
related_name="assets",
to="db.draftissue",
),
),
migrations.AlterField(
model_name="fileasset",
name="entity_type",
field=models.CharField(
blank=True,
choices=[
("ISSUE_ATTACHMENT", "Issue Attachment"),
("ISSUE_DESCRIPTION", "Issue Description"),
("COMMENT_DESCRIPTION", "Comment Description"),
("PAGE_DESCRIPTION", "Page Description"),
("USER_COVER", "User Cover"),
("USER_AVATAR", "User Avatar"),
("WORKSPACE_LOGO", "Workspace Logo"),
("PROJECT_COVER", "Project Cover"),
("DRAFT_ISSUE_ATTACHMENT", "Draft Issue Attachment"),
("DRAFT_ISSUE_DESCRIPTION", "Draft Issue Description"),
],
max_length=255,
null=True,
),
),
]

View File

@@ -0,0 +1,187 @@
# Generated by Django 4.2.16 on 2024-10-15 11:31
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("db", "0080_fileasset_draft_issue_alter_fileasset_entity_type"),
]
operations = [
migrations.RemoveField(
model_name="globalview",
name="created_by",
),
migrations.RemoveField(
model_name="globalview",
name="updated_by",
),
migrations.RemoveField(
model_name="globalview",
name="workspace",
),
migrations.AlterUniqueTogether(
name="issueviewfavorite",
unique_together=None,
),
migrations.RemoveField(
model_name="issueviewfavorite",
name="created_by",
),
migrations.RemoveField(
model_name="issueviewfavorite",
name="project",
),
migrations.RemoveField(
model_name="issueviewfavorite",
name="updated_by",
),
migrations.RemoveField(
model_name="issueviewfavorite",
name="user",
),
migrations.RemoveField(
model_name="issueviewfavorite",
name="view",
),
migrations.RemoveField(
model_name="issueviewfavorite",
name="workspace",
),
migrations.AlterUniqueTogether(
name="modulefavorite",
unique_together=None,
),
migrations.RemoveField(
model_name="modulefavorite",
name="created_by",
),
migrations.RemoveField(
model_name="modulefavorite",
name="module",
),
migrations.RemoveField(
model_name="modulefavorite",
name="project",
),
migrations.RemoveField(
model_name="modulefavorite",
name="updated_by",
),
migrations.RemoveField(
model_name="modulefavorite",
name="user",
),
migrations.RemoveField(
model_name="modulefavorite",
name="workspace",
),
migrations.RemoveField(
model_name="pageblock",
name="created_by",
),
migrations.RemoveField(
model_name="pageblock",
name="issue",
),
migrations.RemoveField(
model_name="pageblock",
name="page",
),
migrations.RemoveField(
model_name="pageblock",
name="project",
),
migrations.RemoveField(
model_name="pageblock",
name="updated_by",
),
migrations.RemoveField(
model_name="pageblock",
name="workspace",
),
migrations.AlterUniqueTogether(
name="pagefavorite",
unique_together=None,
),
migrations.RemoveField(
model_name="pagefavorite",
name="created_by",
),
migrations.RemoveField(
model_name="pagefavorite",
name="page",
),
migrations.RemoveField(
model_name="pagefavorite",
name="project",
),
migrations.RemoveField(
model_name="pagefavorite",
name="updated_by",
),
migrations.RemoveField(
model_name="pagefavorite",
name="user",
),
migrations.RemoveField(
model_name="pagefavorite",
name="workspace",
),
migrations.AlterUniqueTogether(
name="projectfavorite",
unique_together=None,
),
migrations.RemoveField(
model_name="projectfavorite",
name="created_by",
),
migrations.RemoveField(
model_name="projectfavorite",
name="project",
),
migrations.RemoveField(
model_name="projectfavorite",
name="updated_by",
),
migrations.RemoveField(
model_name="projectfavorite",
name="user",
),
migrations.RemoveField(
model_name="projectfavorite",
name="workspace",
),
migrations.AddField(
model_name="issuetype",
name="external_id",
field=models.CharField(blank=True, max_length=255, null=True),
),
migrations.AddField(
model_name="issuetype",
name="external_source",
field=models.CharField(blank=True, max_length=255, null=True),
),
migrations.DeleteModel(
name="CycleFavorite",
),
migrations.DeleteModel(
name="GlobalView",
),
migrations.DeleteModel(
name="IssueViewFavorite",
),
migrations.DeleteModel(
name="ModuleFavorite",
),
migrations.DeleteModel(
name="PageBlock",
),
migrations.DeleteModel(
name="PageFavorite",
),
migrations.DeleteModel(
name="ProjectFavorite",
),
]

View File

@@ -0,0 +1,63 @@
# Generated by Django 4.2.15 on 2024-10-22 08:00
from django.db import migrations, models
import django.db.models.deletion
import django.db.models.manager
class Migration(migrations.Migration):
dependencies = [
("db", "0081_remove_globalview_created_by_and_more"),
]
operations = [
migrations.AlterModelManagers(
name="issue",
managers=[
("issue_objects", django.db.models.manager.Manager()),
],
),
migrations.AlterField(
model_name="cycleissue",
name="issue",
field=models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="issue_cycle",
to="db.issue",
),
),
migrations.AlterField(
model_name="draftissuecycle",
name="draft_issue",
field=models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="draft_issue_cycle",
to="db.draftissue",
),
),
migrations.AlterUniqueTogether(
name="cycleissue",
unique_together={("issue", "cycle", "deleted_at")},
),
migrations.AlterUniqueTogether(
name="draftissuecycle",
unique_together={("draft_issue", "cycle", "deleted_at")},
),
migrations.AddConstraint(
model_name="cycleissue",
constraint=models.UniqueConstraint(
condition=models.Q(("deleted_at__isnull", True)),
fields=("cycle", "issue"),
name="cycle_issue_when_deleted_at_null",
),
),
migrations.AddConstraint(
model_name="draftissuecycle",
constraint=models.UniqueConstraint(
condition=models.Q(("deleted_at__isnull", True)),
fields=("draft_issue", "cycle"),
name="draft_issue_cycle_when_deleted_at_null",
),
),
]
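The `cycle_issue_when_deleted_at_null` constraint above enforces uniqueness of `(cycle, issue)` only among rows whose `deleted_at` is NULL, so a soft-deleted row never blocks re-adding the same issue to a cycle. A small sketch of that invariant in plain Python (`violates_soft_delete_unique` and the dict rows are illustrative, not Django API):

```python
def violates_soft_delete_unique(rows, new_row):
    """Return True if inserting new_row would break the partial unique
    constraint: (cycle, issue) unique where deleted_at IS NULL."""
    if new_row["deleted_at"] is not None:
        return False  # soft-deleted rows are exempt from the constraint
    return any(
        r["deleted_at"] is None
        and (r["cycle"], r["issue"]) == (new_row["cycle"], new_row["issue"])
        for r in rows
    )

# A soft-deleted duplicate does not block a fresh live row...
rows = [{"cycle": 1, "issue": 7, "deleted_at": "2024-10-01"}]
ok = violates_soft_delete_unique(rows, {"cycle": 1, "issue": 7, "deleted_at": None})
```

This is why the migration also relaxes `unique_together` to include `deleted_at`: the hard uniqueness moves into the conditional constraint.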

View File

@@ -0,0 +1,874 @@
# Generated by Django 4.2.15 on 2024-11-01 17:02
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import uuid
class Migration(migrations.Migration):
dependencies = [
("db", "0082_alter_issue_managers_alter_cycleissue_issue_and_more"),
]
operations = [
migrations.CreateModel(
name="Device",
fields=[
(
"created_at",
models.DateTimeField(
auto_now_add=True, verbose_name="Created At"
),
),
(
"updated_at",
models.DateTimeField(
auto_now=True, verbose_name="Last Modified At"
),
),
(
"deleted_at",
models.DateTimeField(
blank=True, null=True, verbose_name="Deleted At"
),
),
(
"id",
models.UUIDField(
db_index=True,
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
unique=True,
),
),
(
"device_id",
models.CharField(blank=True, max_length=255, null=True),
),
(
"device_type",
models.CharField(
choices=[
("ANDROID", "Android"),
("IOS", "iOS"),
("WEB", "Web"),
("DESKTOP", "Desktop"),
],
max_length=255,
),
),
(
"push_token",
models.CharField(blank=True, max_length=255, null=True),
),
("is_active", models.BooleanField(default=True)),
(
"created_by",
models.ForeignKey(
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="%(class)s_created_by",
to=settings.AUTH_USER_MODEL,
verbose_name="Created By",
),
),
(
"updated_by",
models.ForeignKey(
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="%(class)s_updated_by",
to=settings.AUTH_USER_MODEL,
verbose_name="Last Modified By",
),
),
(
"user",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="devices",
to=settings.AUTH_USER_MODEL,
),
),
],
options={
"verbose_name": "Device",
"verbose_name_plural": "Devices",
"db_table": "devices",
},
),
migrations.AddField(
model_name="issuetype",
name="is_epic",
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name="workspace",
name="timezone",
field=models.CharField(
choices=[
("Africa/Abidjan", "Africa/Abidjan"),
("Africa/Accra", "Africa/Accra"),
("Africa/Addis_Ababa", "Africa/Addis_Ababa"),
("Africa/Algiers", "Africa/Algiers"),
("Africa/Asmara", "Africa/Asmara"),
("Africa/Asmera", "Africa/Asmera"),
("Africa/Bamako", "Africa/Bamako"),
("Africa/Bangui", "Africa/Bangui"),
("Africa/Banjul", "Africa/Banjul"),
("Africa/Bissau", "Africa/Bissau"),
("Africa/Blantyre", "Africa/Blantyre"),
("Africa/Brazzaville", "Africa/Brazzaville"),
("Africa/Bujumbura", "Africa/Bujumbura"),
("Africa/Cairo", "Africa/Cairo"),
("Africa/Casablanca", "Africa/Casablanca"),
("Africa/Ceuta", "Africa/Ceuta"),
("Africa/Conakry", "Africa/Conakry"),
("Africa/Dakar", "Africa/Dakar"),
("Africa/Dar_es_Salaam", "Africa/Dar_es_Salaam"),
("Africa/Djibouti", "Africa/Djibouti"),
("Africa/Douala", "Africa/Douala"),
("Africa/El_Aaiun", "Africa/El_Aaiun"),
("Africa/Freetown", "Africa/Freetown"),
("Africa/Gaborone", "Africa/Gaborone"),
("Africa/Harare", "Africa/Harare"),
("Africa/Johannesburg", "Africa/Johannesburg"),
("Africa/Juba", "Africa/Juba"),
("Africa/Kampala", "Africa/Kampala"),
("Africa/Khartoum", "Africa/Khartoum"),
("Africa/Kigali", "Africa/Kigali"),
("Africa/Kinshasa", "Africa/Kinshasa"),
("Africa/Lagos", "Africa/Lagos"),
("Africa/Libreville", "Africa/Libreville"),
("Africa/Lome", "Africa/Lome"),
("Africa/Luanda", "Africa/Luanda"),
("Africa/Lubumbashi", "Africa/Lubumbashi"),
("Africa/Lusaka", "Africa/Lusaka"),
("Africa/Malabo", "Africa/Malabo"),
("Africa/Maputo", "Africa/Maputo"),
("Africa/Maseru", "Africa/Maseru"),
("Africa/Mbabane", "Africa/Mbabane"),
("Africa/Mogadishu", "Africa/Mogadishu"),
("Africa/Monrovia", "Africa/Monrovia"),
("Africa/Nairobi", "Africa/Nairobi"),
("Africa/Ndjamena", "Africa/Ndjamena"),
("Africa/Niamey", "Africa/Niamey"),
("Africa/Nouakchott", "Africa/Nouakchott"),
("Africa/Ouagadougou", "Africa/Ouagadougou"),
("Africa/Porto-Novo", "Africa/Porto-Novo"),
("Africa/Sao_Tome", "Africa/Sao_Tome"),
("Africa/Timbuktu", "Africa/Timbuktu"),
("Africa/Tripoli", "Africa/Tripoli"),
("Africa/Tunis", "Africa/Tunis"),
("Africa/Windhoek", "Africa/Windhoek"),
("America/Adak", "America/Adak"),
("America/Anchorage", "America/Anchorage"),
("America/Anguilla", "America/Anguilla"),
("America/Antigua", "America/Antigua"),
("America/Araguaina", "America/Araguaina"),
(
"America/Argentina/Buenos_Aires",
"America/Argentina/Buenos_Aires",
),
(
"America/Argentina/Catamarca",
"America/Argentina/Catamarca",
),
(
"America/Argentina/ComodRivadavia",
"America/Argentina/ComodRivadavia",
),
("America/Argentina/Cordoba", "America/Argentina/Cordoba"),
("America/Argentina/Jujuy", "America/Argentina/Jujuy"),
(
"America/Argentina/La_Rioja",
"America/Argentina/La_Rioja",
),
("America/Argentina/Mendoza", "America/Argentina/Mendoza"),
(
"America/Argentina/Rio_Gallegos",
"America/Argentina/Rio_Gallegos",
),
("America/Argentina/Salta", "America/Argentina/Salta"),
(
"America/Argentina/San_Juan",
"America/Argentina/San_Juan",
),
(
"America/Argentina/San_Luis",
"America/Argentina/San_Luis",
),
("America/Argentina/Tucuman", "America/Argentina/Tucuman"),
("America/Argentina/Ushuaia", "America/Argentina/Ushuaia"),
("America/Aruba", "America/Aruba"),
("America/Asuncion", "America/Asuncion"),
("America/Atikokan", "America/Atikokan"),
("America/Atka", "America/Atka"),
("America/Bahia", "America/Bahia"),
("America/Bahia_Banderas", "America/Bahia_Banderas"),
("America/Barbados", "America/Barbados"),
("America/Belem", "America/Belem"),
("America/Belize", "America/Belize"),
("America/Blanc-Sablon", "America/Blanc-Sablon"),
("America/Boa_Vista", "America/Boa_Vista"),
("America/Bogota", "America/Bogota"),
("America/Boise", "America/Boise"),
("America/Buenos_Aires", "America/Buenos_Aires"),
("America/Cambridge_Bay", "America/Cambridge_Bay"),
("America/Campo_Grande", "America/Campo_Grande"),
("America/Cancun", "America/Cancun"),
("America/Caracas", "America/Caracas"),
("America/Catamarca", "America/Catamarca"),
("America/Cayenne", "America/Cayenne"),
("America/Cayman", "America/Cayman"),
("America/Chicago", "America/Chicago"),
("America/Chihuahua", "America/Chihuahua"),
("America/Ciudad_Juarez", "America/Ciudad_Juarez"),
("America/Coral_Harbour", "America/Coral_Harbour"),
("America/Cordoba", "America/Cordoba"),
("America/Costa_Rica", "America/Costa_Rica"),
("America/Creston", "America/Creston"),
("America/Cuiaba", "America/Cuiaba"),
("America/Curacao", "America/Curacao"),
("America/Danmarkshavn", "America/Danmarkshavn"),
("America/Dawson", "America/Dawson"),
("America/Dawson_Creek", "America/Dawson_Creek"),
("America/Denver", "America/Denver"),
("America/Detroit", "America/Detroit"),
("America/Dominica", "America/Dominica"),
("America/Edmonton", "America/Edmonton"),
("America/Eirunepe", "America/Eirunepe"),
("America/El_Salvador", "America/El_Salvador"),
("America/Ensenada", "America/Ensenada"),
("America/Fort_Nelson", "America/Fort_Nelson"),
("America/Fort_Wayne", "America/Fort_Wayne"),
("America/Fortaleza", "America/Fortaleza"),
("America/Glace_Bay", "America/Glace_Bay"),
("America/Godthab", "America/Godthab"),
("America/Goose_Bay", "America/Goose_Bay"),
("America/Grand_Turk", "America/Grand_Turk"),
("America/Grenada", "America/Grenada"),
("America/Guadeloupe", "America/Guadeloupe"),
("America/Guatemala", "America/Guatemala"),
("America/Guayaquil", "America/Guayaquil"),
("America/Guyana", "America/Guyana"),
("America/Halifax", "America/Halifax"),
("America/Havana", "America/Havana"),
("America/Hermosillo", "America/Hermosillo"),
(
"America/Indiana/Indianapolis",
"America/Indiana/Indianapolis",
),
("America/Indiana/Knox", "America/Indiana/Knox"),
("America/Indiana/Marengo", "America/Indiana/Marengo"),
(
"America/Indiana/Petersburg",
"America/Indiana/Petersburg",
),
("America/Indiana/Tell_City", "America/Indiana/Tell_City"),
("America/Indiana/Vevay", "America/Indiana/Vevay"),
("America/Indiana/Vincennes", "America/Indiana/Vincennes"),
("America/Indiana/Winamac", "America/Indiana/Winamac"),
("America/Indianapolis", "America/Indianapolis"),
("America/Inuvik", "America/Inuvik"),
("America/Iqaluit", "America/Iqaluit"),
("America/Jamaica", "America/Jamaica"),
("America/Jujuy", "America/Jujuy"),
("America/Juneau", "America/Juneau"),
(
"America/Kentucky/Louisville",
"America/Kentucky/Louisville",
),
(
"America/Kentucky/Monticello",
"America/Kentucky/Monticello",
),
("America/Knox_IN", "America/Knox_IN"),
("America/Kralendijk", "America/Kralendijk"),
("America/La_Paz", "America/La_Paz"),
("America/Lima", "America/Lima"),
("America/Los_Angeles", "America/Los_Angeles"),
("America/Louisville", "America/Louisville"),
("America/Lower_Princes", "America/Lower_Princes"),
("America/Maceio", "America/Maceio"),
("America/Managua", "America/Managua"),
("America/Manaus", "America/Manaus"),
("America/Marigot", "America/Marigot"),
("America/Martinique", "America/Martinique"),
("America/Matamoros", "America/Matamoros"),
("America/Mazatlan", "America/Mazatlan"),
("America/Mendoza", "America/Mendoza"),
("America/Menominee", "America/Menominee"),
("America/Merida", "America/Merida"),
("America/Metlakatla", "America/Metlakatla"),
("America/Mexico_City", "America/Mexico_City"),
("America/Miquelon", "America/Miquelon"),
("America/Moncton", "America/Moncton"),
("America/Monterrey", "America/Monterrey"),
("America/Montevideo", "America/Montevideo"),
("America/Montreal", "America/Montreal"),
("America/Montserrat", "America/Montserrat"),
("America/Nassau", "America/Nassau"),
("America/New_York", "America/New_York"),
("America/Nipigon", "America/Nipigon"),
("America/Nome", "America/Nome"),
("America/Noronha", "America/Noronha"),
(
"America/North_Dakota/Beulah",
"America/North_Dakota/Beulah",
),
(
"America/North_Dakota/Center",
"America/North_Dakota/Center",
),
(
"America/North_Dakota/New_Salem",
"America/North_Dakota/New_Salem",
),
("America/Nuuk", "America/Nuuk"),
("America/Ojinaga", "America/Ojinaga"),
("America/Panama", "America/Panama"),
("America/Pangnirtung", "America/Pangnirtung"),
("America/Paramaribo", "America/Paramaribo"),
("America/Phoenix", "America/Phoenix"),
("America/Port-au-Prince", "America/Port-au-Prince"),
("America/Port_of_Spain", "America/Port_of_Spain"),
("America/Porto_Acre", "America/Porto_Acre"),
("America/Porto_Velho", "America/Porto_Velho"),
("America/Puerto_Rico", "America/Puerto_Rico"),
("America/Punta_Arenas", "America/Punta_Arenas"),
("America/Rainy_River", "America/Rainy_River"),
("America/Rankin_Inlet", "America/Rankin_Inlet"),
("America/Recife", "America/Recife"),
("America/Regina", "America/Regina"),
("America/Resolute", "America/Resolute"),
("America/Rio_Branco", "America/Rio_Branco"),
("America/Rosario", "America/Rosario"),
("America/Santa_Isabel", "America/Santa_Isabel"),
("America/Santarem", "America/Santarem"),
("America/Santiago", "America/Santiago"),
("America/Santo_Domingo", "America/Santo_Domingo"),
("America/Sao_Paulo", "America/Sao_Paulo"),
("America/Scoresbysund", "America/Scoresbysund"),
("America/Shiprock", "America/Shiprock"),
("America/Sitka", "America/Sitka"),
("America/St_Barthelemy", "America/St_Barthelemy"),
("America/St_Johns", "America/St_Johns"),
("America/St_Kitts", "America/St_Kitts"),
("America/St_Lucia", "America/St_Lucia"),
("America/St_Thomas", "America/St_Thomas"),
("America/St_Vincent", "America/St_Vincent"),
("America/Swift_Current", "America/Swift_Current"),
("America/Tegucigalpa", "America/Tegucigalpa"),
("America/Thule", "America/Thule"),
("America/Thunder_Bay", "America/Thunder_Bay"),
("America/Tijuana", "America/Tijuana"),
("America/Toronto", "America/Toronto"),
("America/Tortola", "America/Tortola"),
("America/Vancouver", "America/Vancouver"),
("America/Virgin", "America/Virgin"),
("America/Whitehorse", "America/Whitehorse"),
("America/Winnipeg", "America/Winnipeg"),
("America/Yakutat", "America/Yakutat"),
("America/Yellowknife", "America/Yellowknife"),
("Antarctica/Casey", "Antarctica/Casey"),
("Antarctica/Davis", "Antarctica/Davis"),
("Antarctica/DumontDUrville", "Antarctica/DumontDUrville"),
("Antarctica/Macquarie", "Antarctica/Macquarie"),
("Antarctica/Mawson", "Antarctica/Mawson"),
("Antarctica/McMurdo", "Antarctica/McMurdo"),
("Antarctica/Palmer", "Antarctica/Palmer"),
("Antarctica/Rothera", "Antarctica/Rothera"),
("Antarctica/South_Pole", "Antarctica/South_Pole"),
("Antarctica/Syowa", "Antarctica/Syowa"),
("Antarctica/Troll", "Antarctica/Troll"),
("Antarctica/Vostok", "Antarctica/Vostok"),
("Arctic/Longyearbyen", "Arctic/Longyearbyen"),
("Asia/Aden", "Asia/Aden"),
("Asia/Almaty", "Asia/Almaty"),
("Asia/Amman", "Asia/Amman"),
("Asia/Anadyr", "Asia/Anadyr"),
("Asia/Aqtau", "Asia/Aqtau"),
("Asia/Aqtobe", "Asia/Aqtobe"),
("Asia/Ashgabat", "Asia/Ashgabat"),
("Asia/Ashkhabad", "Asia/Ashkhabad"),
("Asia/Atyrau", "Asia/Atyrau"),
("Asia/Baghdad", "Asia/Baghdad"),
("Asia/Bahrain", "Asia/Bahrain"),
("Asia/Baku", "Asia/Baku"),
("Asia/Bangkok", "Asia/Bangkok"),
("Asia/Barnaul", "Asia/Barnaul"),
("Asia/Beirut", "Asia/Beirut"),
("Asia/Bishkek", "Asia/Bishkek"),
("Asia/Brunei", "Asia/Brunei"),
("Asia/Calcutta", "Asia/Calcutta"),
("Asia/Chita", "Asia/Chita"),
("Asia/Choibalsan", "Asia/Choibalsan"),
("Asia/Chongqing", "Asia/Chongqing"),
("Asia/Chungking", "Asia/Chungking"),
("Asia/Colombo", "Asia/Colombo"),
("Asia/Dacca", "Asia/Dacca"),
("Asia/Damascus", "Asia/Damascus"),
("Asia/Dhaka", "Asia/Dhaka"),
("Asia/Dili", "Asia/Dili"),
("Asia/Dubai", "Asia/Dubai"),
("Asia/Dushanbe", "Asia/Dushanbe"),
("Asia/Famagusta", "Asia/Famagusta"),
("Asia/Gaza", "Asia/Gaza"),
("Asia/Harbin", "Asia/Harbin"),
("Asia/Hebron", "Asia/Hebron"),
("Asia/Ho_Chi_Minh", "Asia/Ho_Chi_Minh"),
("Asia/Hong_Kong", "Asia/Hong_Kong"),
("Asia/Hovd", "Asia/Hovd"),
("Asia/Irkutsk", "Asia/Irkutsk"),
("Asia/Istanbul", "Asia/Istanbul"),
("Asia/Jakarta", "Asia/Jakarta"),
("Asia/Jayapura", "Asia/Jayapura"),
("Asia/Jerusalem", "Asia/Jerusalem"),
("Asia/Kabul", "Asia/Kabul"),
("Asia/Kamchatka", "Asia/Kamchatka"),
("Asia/Karachi", "Asia/Karachi"),
("Asia/Kashgar", "Asia/Kashgar"),
("Asia/Kathmandu", "Asia/Kathmandu"),
("Asia/Katmandu", "Asia/Katmandu"),
("Asia/Khandyga", "Asia/Khandyga"),
("Asia/Kolkata", "Asia/Kolkata"),
("Asia/Krasnoyarsk", "Asia/Krasnoyarsk"),
("Asia/Kuala_Lumpur", "Asia/Kuala_Lumpur"),
("Asia/Kuching", "Asia/Kuching"),
("Asia/Kuwait", "Asia/Kuwait"),
("Asia/Macao", "Asia/Macao"),
("Asia/Macau", "Asia/Macau"),
("Asia/Magadan", "Asia/Magadan"),
("Asia/Makassar", "Asia/Makassar"),
("Asia/Manila", "Asia/Manila"),
("Asia/Muscat", "Asia/Muscat"),
("Asia/Nicosia", "Asia/Nicosia"),
("Asia/Novokuznetsk", "Asia/Novokuznetsk"),
("Asia/Novosibirsk", "Asia/Novosibirsk"),
("Asia/Omsk", "Asia/Omsk"),
("Asia/Oral", "Asia/Oral"),
("Asia/Phnom_Penh", "Asia/Phnom_Penh"),
("Asia/Pontianak", "Asia/Pontianak"),
("Asia/Pyongyang", "Asia/Pyongyang"),
("Asia/Qatar", "Asia/Qatar"),
("Asia/Qostanay", "Asia/Qostanay"),
("Asia/Qyzylorda", "Asia/Qyzylorda"),
("Asia/Rangoon", "Asia/Rangoon"),
("Asia/Riyadh", "Asia/Riyadh"),
("Asia/Saigon", "Asia/Saigon"),
("Asia/Sakhalin", "Asia/Sakhalin"),
("Asia/Samarkand", "Asia/Samarkand"),
("Asia/Seoul", "Asia/Seoul"),
("Asia/Shanghai", "Asia/Shanghai"),
("Asia/Singapore", "Asia/Singapore"),
("Asia/Srednekolymsk", "Asia/Srednekolymsk"),
("Asia/Taipei", "Asia/Taipei"),
("Asia/Tashkent", "Asia/Tashkent"),
("Asia/Tbilisi", "Asia/Tbilisi"),
("Asia/Tehran", "Asia/Tehran"),
("Asia/Tel_Aviv", "Asia/Tel_Aviv"),
("Asia/Thimbu", "Asia/Thimbu"),
("Asia/Thimphu", "Asia/Thimphu"),
("Asia/Tokyo", "Asia/Tokyo"),
("Asia/Tomsk", "Asia/Tomsk"),
("Asia/Ujung_Pandang", "Asia/Ujung_Pandang"),
("Asia/Ulaanbaatar", "Asia/Ulaanbaatar"),
("Asia/Ulan_Bator", "Asia/Ulan_Bator"),
("Asia/Urumqi", "Asia/Urumqi"),
("Asia/Ust-Nera", "Asia/Ust-Nera"),
("Asia/Vientiane", "Asia/Vientiane"),
("Asia/Vladivostok", "Asia/Vladivostok"),
("Asia/Yakutsk", "Asia/Yakutsk"),
("Asia/Yangon", "Asia/Yangon"),
("Asia/Yekaterinburg", "Asia/Yekaterinburg"),
("Asia/Yerevan", "Asia/Yerevan"),
("Atlantic/Azores", "Atlantic/Azores"),
("Atlantic/Bermuda", "Atlantic/Bermuda"),
("Atlantic/Canary", "Atlantic/Canary"),
("Atlantic/Cape_Verde", "Atlantic/Cape_Verde"),
("Atlantic/Faeroe", "Atlantic/Faeroe"),
("Atlantic/Faroe", "Atlantic/Faroe"),
("Atlantic/Jan_Mayen", "Atlantic/Jan_Mayen"),
("Atlantic/Madeira", "Atlantic/Madeira"),
("Atlantic/Reykjavik", "Atlantic/Reykjavik"),
("Atlantic/South_Georgia", "Atlantic/South_Georgia"),
("Atlantic/St_Helena", "Atlantic/St_Helena"),
("Atlantic/Stanley", "Atlantic/Stanley"),
("Australia/ACT", "Australia/ACT"),
("Australia/Adelaide", "Australia/Adelaide"),
("Australia/Brisbane", "Australia/Brisbane"),
("Australia/Broken_Hill", "Australia/Broken_Hill"),
("Australia/Canberra", "Australia/Canberra"),
("Australia/Currie", "Australia/Currie"),
("Australia/Darwin", "Australia/Darwin"),
("Australia/Eucla", "Australia/Eucla"),
("Australia/Hobart", "Australia/Hobart"),
("Australia/LHI", "Australia/LHI"),
("Australia/Lindeman", "Australia/Lindeman"),
("Australia/Lord_Howe", "Australia/Lord_Howe"),
("Australia/Melbourne", "Australia/Melbourne"),
("Australia/NSW", "Australia/NSW"),
("Australia/North", "Australia/North"),
("Australia/Perth", "Australia/Perth"),
("Australia/Queensland", "Australia/Queensland"),
("Australia/South", "Australia/South"),
("Australia/Sydney", "Australia/Sydney"),
("Australia/Tasmania", "Australia/Tasmania"),
("Australia/Victoria", "Australia/Victoria"),
("Australia/West", "Australia/West"),
("Australia/Yancowinna", "Australia/Yancowinna"),
("Brazil/Acre", "Brazil/Acre"),
("Brazil/DeNoronha", "Brazil/DeNoronha"),
("Brazil/East", "Brazil/East"),
("Brazil/West", "Brazil/West"),
("CET", "CET"),
("CST6CDT", "CST6CDT"),
("Canada/Atlantic", "Canada/Atlantic"),
("Canada/Central", "Canada/Central"),
("Canada/Eastern", "Canada/Eastern"),
("Canada/Mountain", "Canada/Mountain"),
("Canada/Newfoundland", "Canada/Newfoundland"),
("Canada/Pacific", "Canada/Pacific"),
("Canada/Saskatchewan", "Canada/Saskatchewan"),
("Canada/Yukon", "Canada/Yukon"),
("Chile/Continental", "Chile/Continental"),
("Chile/EasterIsland", "Chile/EasterIsland"),
("Cuba", "Cuba"),
("EET", "EET"),
("EST", "EST"),
("EST5EDT", "EST5EDT"),
("Egypt", "Egypt"),
("Eire", "Eire"),
("Etc/GMT", "Etc/GMT"),
("Etc/GMT+0", "Etc/GMT+0"),
("Etc/GMT+1", "Etc/GMT+1"),
("Etc/GMT+10", "Etc/GMT+10"),
("Etc/GMT+11", "Etc/GMT+11"),
("Etc/GMT+12", "Etc/GMT+12"),
("Etc/GMT+2", "Etc/GMT+2"),
("Etc/GMT+3", "Etc/GMT+3"),
("Etc/GMT+4", "Etc/GMT+4"),
("Etc/GMT+5", "Etc/GMT+5"),
("Etc/GMT+6", "Etc/GMT+6"),
("Etc/GMT+7", "Etc/GMT+7"),
("Etc/GMT+8", "Etc/GMT+8"),
("Etc/GMT+9", "Etc/GMT+9"),
("Etc/GMT-0", "Etc/GMT-0"),
("Etc/GMT-1", "Etc/GMT-1"),
("Etc/GMT-10", "Etc/GMT-10"),
("Etc/GMT-11", "Etc/GMT-11"),
("Etc/GMT-12", "Etc/GMT-12"),
("Etc/GMT-13", "Etc/GMT-13"),
("Etc/GMT-14", "Etc/GMT-14"),
("Etc/GMT-2", "Etc/GMT-2"),
("Etc/GMT-3", "Etc/GMT-3"),
("Etc/GMT-4", "Etc/GMT-4"),
("Etc/GMT-5", "Etc/GMT-5"),
("Etc/GMT-6", "Etc/GMT-6"),
("Etc/GMT-7", "Etc/GMT-7"),
("Etc/GMT-8", "Etc/GMT-8"),
("Etc/GMT-9", "Etc/GMT-9"),
("Etc/GMT0", "Etc/GMT0"),
("Etc/Greenwich", "Etc/Greenwich"),
("Etc/UCT", "Etc/UCT"),
("Etc/UTC", "Etc/UTC"),
("Etc/Universal", "Etc/Universal"),
("Etc/Zulu", "Etc/Zulu"),
("Europe/Amsterdam", "Europe/Amsterdam"),
("Europe/Andorra", "Europe/Andorra"),
("Europe/Astrakhan", "Europe/Astrakhan"),
("Europe/Athens", "Europe/Athens"),
("Europe/Belfast", "Europe/Belfast"),
("Europe/Belgrade", "Europe/Belgrade"),
("Europe/Berlin", "Europe/Berlin"),
("Europe/Bratislava", "Europe/Bratislava"),
("Europe/Brussels", "Europe/Brussels"),
("Europe/Bucharest", "Europe/Bucharest"),
("Europe/Budapest", "Europe/Budapest"),
("Europe/Busingen", "Europe/Busingen"),
("Europe/Chisinau", "Europe/Chisinau"),
("Europe/Copenhagen", "Europe/Copenhagen"),
("Europe/Dublin", "Europe/Dublin"),
("Europe/Gibraltar", "Europe/Gibraltar"),
("Europe/Guernsey", "Europe/Guernsey"),
("Europe/Helsinki", "Europe/Helsinki"),
("Europe/Isle_of_Man", "Europe/Isle_of_Man"),
("Europe/Istanbul", "Europe/Istanbul"),
("Europe/Jersey", "Europe/Jersey"),
("Europe/Kaliningrad", "Europe/Kaliningrad"),
("Europe/Kiev", "Europe/Kiev"),
("Europe/Kirov", "Europe/Kirov"),
("Europe/Kyiv", "Europe/Kyiv"),
("Europe/Lisbon", "Europe/Lisbon"),
("Europe/Ljubljana", "Europe/Ljubljana"),
("Europe/London", "Europe/London"),
("Europe/Luxembourg", "Europe/Luxembourg"),
("Europe/Madrid", "Europe/Madrid"),
("Europe/Malta", "Europe/Malta"),
("Europe/Mariehamn", "Europe/Mariehamn"),
("Europe/Minsk", "Europe/Minsk"),
("Europe/Monaco", "Europe/Monaco"),
("Europe/Moscow", "Europe/Moscow"),
("Europe/Nicosia", "Europe/Nicosia"),
("Europe/Oslo", "Europe/Oslo"),
("Europe/Paris", "Europe/Paris"),
("Europe/Podgorica", "Europe/Podgorica"),
("Europe/Prague", "Europe/Prague"),
("Europe/Riga", "Europe/Riga"),
("Europe/Rome", "Europe/Rome"),
("Europe/Samara", "Europe/Samara"),
("Europe/San_Marino", "Europe/San_Marino"),
("Europe/Sarajevo", "Europe/Sarajevo"),
("Europe/Saratov", "Europe/Saratov"),
("Europe/Simferopol", "Europe/Simferopol"),
("Europe/Skopje", "Europe/Skopje"),
("Europe/Sofia", "Europe/Sofia"),
("Europe/Stockholm", "Europe/Stockholm"),
("Europe/Tallinn", "Europe/Tallinn"),
("Europe/Tirane", "Europe/Tirane"),
("Europe/Tiraspol", "Europe/Tiraspol"),
("Europe/Ulyanovsk", "Europe/Ulyanovsk"),
("Europe/Uzhgorod", "Europe/Uzhgorod"),
("Europe/Vaduz", "Europe/Vaduz"),
("Europe/Vatican", "Europe/Vatican"),
("Europe/Vienna", "Europe/Vienna"),
("Europe/Vilnius", "Europe/Vilnius"),
("Europe/Volgograd", "Europe/Volgograd"),
("Europe/Warsaw", "Europe/Warsaw"),
("Europe/Zagreb", "Europe/Zagreb"),
("Europe/Zaporozhye", "Europe/Zaporozhye"),
("Europe/Zurich", "Europe/Zurich"),
("GB", "GB"),
("GB-Eire", "GB-Eire"),
("GMT", "GMT"),
("GMT+0", "GMT+0"),
("GMT-0", "GMT-0"),
("GMT0", "GMT0"),
("Greenwich", "Greenwich"),
("HST", "HST"),
("Hongkong", "Hongkong"),
("Iceland", "Iceland"),
("Indian/Antananarivo", "Indian/Antananarivo"),
("Indian/Chagos", "Indian/Chagos"),
("Indian/Christmas", "Indian/Christmas"),
("Indian/Cocos", "Indian/Cocos"),
("Indian/Comoro", "Indian/Comoro"),
("Indian/Kerguelen", "Indian/Kerguelen"),
("Indian/Mahe", "Indian/Mahe"),
("Indian/Maldives", "Indian/Maldives"),
("Indian/Mauritius", "Indian/Mauritius"),
("Indian/Mayotte", "Indian/Mayotte"),
("Indian/Reunion", "Indian/Reunion"),
("Iran", "Iran"),
("Israel", "Israel"),
("Jamaica", "Jamaica"),
("Japan", "Japan"),
("Kwajalein", "Kwajalein"),
("Libya", "Libya"),
("MET", "MET"),
("MST", "MST"),
("MST7MDT", "MST7MDT"),
("Mexico/BajaNorte", "Mexico/BajaNorte"),
("Mexico/BajaSur", "Mexico/BajaSur"),
("Mexico/General", "Mexico/General"),
("NZ", "NZ"),
("NZ-CHAT", "NZ-CHAT"),
("Navajo", "Navajo"),
("PRC", "PRC"),
("PST8PDT", "PST8PDT"),
("Pacific/Apia", "Pacific/Apia"),
("Pacific/Auckland", "Pacific/Auckland"),
("Pacific/Bougainville", "Pacific/Bougainville"),
("Pacific/Chatham", "Pacific/Chatham"),
("Pacific/Chuuk", "Pacific/Chuuk"),
("Pacific/Easter", "Pacific/Easter"),
("Pacific/Efate", "Pacific/Efate"),
("Pacific/Enderbury", "Pacific/Enderbury"),
("Pacific/Fakaofo", "Pacific/Fakaofo"),
("Pacific/Fiji", "Pacific/Fiji"),
("Pacific/Funafuti", "Pacific/Funafuti"),
("Pacific/Galapagos", "Pacific/Galapagos"),
("Pacific/Gambier", "Pacific/Gambier"),
("Pacific/Guadalcanal", "Pacific/Guadalcanal"),
("Pacific/Guam", "Pacific/Guam"),
("Pacific/Honolulu", "Pacific/Honolulu"),
("Pacific/Johnston", "Pacific/Johnston"),
("Pacific/Kanton", "Pacific/Kanton"),
("Pacific/Kiritimati", "Pacific/Kiritimati"),
("Pacific/Kosrae", "Pacific/Kosrae"),
("Pacific/Kwajalein", "Pacific/Kwajalein"),
("Pacific/Majuro", "Pacific/Majuro"),
("Pacific/Marquesas", "Pacific/Marquesas"),
("Pacific/Midway", "Pacific/Midway"),
("Pacific/Nauru", "Pacific/Nauru"),
("Pacific/Niue", "Pacific/Niue"),
("Pacific/Norfolk", "Pacific/Norfolk"),
("Pacific/Noumea", "Pacific/Noumea"),
("Pacific/Pago_Pago", "Pacific/Pago_Pago"),
("Pacific/Palau", "Pacific/Palau"),
("Pacific/Pitcairn", "Pacific/Pitcairn"),
("Pacific/Pohnpei", "Pacific/Pohnpei"),
("Pacific/Ponape", "Pacific/Ponape"),
("Pacific/Port_Moresby", "Pacific/Port_Moresby"),
("Pacific/Rarotonga", "Pacific/Rarotonga"),
("Pacific/Saipan", "Pacific/Saipan"),
("Pacific/Samoa", "Pacific/Samoa"),
("Pacific/Tahiti", "Pacific/Tahiti"),
("Pacific/Tarawa", "Pacific/Tarawa"),
("Pacific/Tongatapu", "Pacific/Tongatapu"),
("Pacific/Truk", "Pacific/Truk"),
("Pacific/Wake", "Pacific/Wake"),
("Pacific/Wallis", "Pacific/Wallis"),
("Pacific/Yap", "Pacific/Yap"),
("Poland", "Poland"),
("Portugal", "Portugal"),
("ROC", "ROC"),
("ROK", "ROK"),
("Singapore", "Singapore"),
("Turkey", "Turkey"),
("UCT", "UCT"),
("US/Alaska", "US/Alaska"),
("US/Aleutian", "US/Aleutian"),
("US/Arizona", "US/Arizona"),
("US/Central", "US/Central"),
("US/East-Indiana", "US/East-Indiana"),
("US/Eastern", "US/Eastern"),
("US/Hawaii", "US/Hawaii"),
("US/Indiana-Starke", "US/Indiana-Starke"),
("US/Michigan", "US/Michigan"),
("US/Mountain", "US/Mountain"),
("US/Pacific", "US/Pacific"),
("US/Samoa", "US/Samoa"),
("UTC", "UTC"),
("Universal", "Universal"),
("W-SU", "W-SU"),
("WET", "WET"),
("Zulu", "Zulu"),
],
default="UTC",
max_length=255,
),
),
migrations.AlterField(
model_name="issuerelation",
name="relation_type",
field=models.CharField(
choices=[
("duplicate", "Duplicate"),
("relates_to", "Relates To"),
("blocked_by", "Blocked By"),
("start_before", "Start Before"),
("finish_before", "Finish Before"),
],
default="blocked_by",
max_length=20,
verbose_name="Issue Relation Type",
),
),
migrations.AlterField(
model_name="issuetype",
name="level",
field=models.FloatField(default=0),
),
migrations.AlterField(
model_name="label",
name="project",
field=models.ForeignKey(
null=True,
on_delete=django.db.models.deletion.CASCADE,
related_name="project_%(class)s",
to="db.project",
),
),
migrations.CreateModel(
name="DeviceSession",
fields=[
(
"created_at",
models.DateTimeField(
auto_now_add=True, verbose_name="Created At"
),
),
(
"updated_at",
models.DateTimeField(
auto_now=True, verbose_name="Last Modified At"
),
),
(
"deleted_at",
models.DateTimeField(
blank=True, null=True, verbose_name="Deleted At"
),
),
(
"id",
models.UUIDField(
db_index=True,
default=uuid.uuid4,
editable=False,
primary_key=True,
serialize=False,
unique=True,
),
),
("is_active", models.BooleanField(default=True)),
(
"user_agent",
models.CharField(blank=True, max_length=255, null=True),
),
(
"ip_address",
models.GenericIPAddressField(blank=True, null=True),
),
("start_time", models.DateTimeField(auto_now_add=True)),
("end_time", models.DateTimeField(blank=True, null=True)),
(
"created_by",
models.ForeignKey(
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="%(class)s_created_by",
to=settings.AUTH_USER_MODEL,
verbose_name="Created By",
),
),
(
"device",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="sessions",
to="db.device",
),
),
(
"session",
models.ForeignKey(
on_delete=django.db.models.deletion.CASCADE,
related_name="device_sessions",
to="db.session",
),
),
(
"updated_by",
models.ForeignKey(
null=True,
on_delete=django.db.models.deletion.SET_NULL,
related_name="%(class)s_updated_by",
to=settings.AUTH_USER_MODEL,
verbose_name="Last Modified By",
),
),
],
options={
"verbose_name": "Device Session",
"verbose_name_plural": "Device Sessions",
"db_table": "device_sessions",
},
),
]
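The long `choices` list above (one `(value, label)` tuple per IANA zone name) is the kind of list that can be generated rather than hand-maintained. A minimal sketch using only the standard library (`zoneinfo` is available from Python 3.9; the variable name `TIMEZONE_CHOICES` is illustrative, not taken from the migration):

```python
# Sketch: build a Django-style choices list of IANA time zone names.
# zoneinfo.available_timezones() returns the zone keys known to the
# system tz database, including legacy aliases such as "US/Eastern".
from zoneinfo import available_timezones

TIMEZONE_CHOICES = sorted((tz, tz) for tz in available_timezones())

# Each entry pairs the stored value with its display label, matching
# the ("America/New_York", "America/New_York") shape in the migration.
```

Generating the list at model-definition time keeps it in sync with the installed tz database, at the cost of the migration autodetector seeing a diff whenever that database changes.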


@@ -0,0 +1,75 @@
# Generated by Django 4.2.15 on 2024-11-05 07:02

from django.db import migrations, models


class Migration(migrations.Migration):
dependencies = [
("db", "0083_device_workspace_timezone_and_more"),
]
operations = [
migrations.RemoveConstraint(
model_name="label",
name="label_unique_name_project_when_deleted_at_null",
),
migrations.AlterUniqueTogether(
name="label",
unique_together=set(),
),
migrations.AddField(
model_name="deployboard",
name="is_disabled",
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name="inboxissue",
name="extra",
field=models.JSONField(default=dict),
),
migrations.AddField(
model_name="inboxissue",
name="source_email",
field=models.TextField(blank=True, null=True),
),
migrations.AddField(
model_name="user",
name="bot_type",
field=models.CharField(
blank=True, max_length=30, null=True, verbose_name="Bot Type"
),
),
migrations.AlterField(
model_name="deployboard",
name="entity_name",
field=models.CharField(blank=True, max_length=30, null=True),
),
migrations.AlterField(
model_name="inboxissue",
name="source",
field=models.CharField(
blank=True, default="IN_APP", max_length=255, null=True
),
),
migrations.AddConstraint(
model_name="label",
constraint=models.UniqueConstraint(
condition=models.Q(
("deleted_at__isnull", True), ("project__isnull", True)
),
fields=("name",),
name="unique_name_when_project_null_and_not_deleted",
),
),
migrations.AddConstraint(
model_name="label",
constraint=models.UniqueConstraint(
condition=models.Q(
("deleted_at__isnull", True), ("project__isnull", False)
),
fields=("project", "name"),
name="unique_project_name_when_not_deleted",
),
),
]
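The two `AddConstraint` operations above together enforce soft-delete-aware uniqueness for labels: among live rows (`deleted_at` NULL), a name must be unique within its project, and workspace-level labels (`project` NULL) share a single namespace. A minimal pure-Python sketch of that rule (the helper name and dict shape are illustrative, not part of the codebase):

```python
def violates_label_uniqueness(labels, name, project_id):
    """Return True if adding (name, project_id) would break the rule.

    Mirrors the two conditional UniqueConstraints: soft-deleted rows
    (deleted_at set) never conflict; live rows conflict per project,
    or across the whole workspace when project_id is None.
    """
    return any(
        lbl["deleted_at"] is None
        and lbl["name"] == name
        and lbl["project_id"] == project_id
        for lbl in labels
    )
```

Because both constraints carry a `condition`, the database only checks live rows, so a deleted label's name can be reused immediately.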
