Compare commits

...

440 Commits

Author SHA1 Message Date
NarayanBavisetti
509618746c Merge branch 'preview' of github.com:makeplane/plane into fix-intake-cycle-module-issue 2024-10-22 18:57:33 +05:30
NarayanBavisetti
c76b247abd chore: changed the annotate for cycle id 2024-10-22 18:32:58 +05:30
Akshita Goyal
d859ab9c39 [WEB-2708] fix: intake module and cycle addition fixed (#5890)
* fix: intake module and cycle addition fixed

* chore: fixed the search endpoint

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-10-22 17:59:07 +05:30
NarayanBavisetti
c462b00889 Merge branch 'preview' of github.com:makeplane/plane into fix-intake-cycle-module-issue 2024-10-22 17:19:11 +05:30
NarayanBavisetti
d6c0e1945e chore: fixed the search endpoint 2024-10-22 16:42:27 +05:30
gakshita
9d096cd1ab fix: intake module and cycle addition fixed 2024-10-22 16:36:18 +05:30
Anmol Singh Bhatia
36b868e375 [WEB-2707] fix: draft issue module update and code refactor (#5889)
* chore: draft issue module update

* chore: code refactor
2024-10-22 16:16:29 +05:30
Aaryan Khandelwal
4c20be6cf2 [PE-68] fix: markdown transformation of mention and custom image components (#5864)
* fix: markdown content for mention and custom image extensions

* style: update issue embed upgrade card

* chore: added string escapes
2024-10-22 14:29:50 +05:30
Bavisetti Narayan
7bf4620bc1 chore: soft deletion of cycle and module (#5884)
* chore: soft deletion of cycle and module

* chore: cycle module soft delete

* chore: added the deletion task

* chore: updated the env example

* chore: cycle issue unique constraints

* chore: updated the Q operator
2024-10-22 14:21:26 +05:30
Nikhil
00eff43f4d fix: bucket policy script to handle error conditions (#5887)
* fix: bucket policy script to handle error conditions

* dev: handle edge cases
2024-10-22 14:19:43 +05:30
sriram veeraghanta
3d3f1b8f74 fix: typescript version consistency 2024-10-22 14:13:28 +05:30
sriram veeraghanta
b87516b0be chore: fixing inconsistent dependencies across the platform (#5885)
* chore: fixing inconsistent dependencies across the platform

* fix: fixing peer dependencies

* chore: yarn lock regeneration
2024-10-22 14:03:34 +05:30
Anmol Singh Bhatia
8a1d3c4cf9 chore: urgent priority icon improvement (#5879) 2024-10-22 13:25:22 +05:30
Akshita Goyal
0f25f39404 WEB-2381 Chore: intake refactor (#5752)
* chore: intake emails and forms

* fix: moved files to ee

* fix: intake form ui

* fix: settings apis integrated

* fix: removed publish api

* fix: removed space app

* fix: lint issue

* fix: removed logs

* fix: removed comment

* fix: improved success image
2024-10-22 12:09:03 +05:30
sriram veeraghanta
fb49644185 fix: renaming the action and formatting 2024-10-21 19:26:16 +05:30
Nikhil
b745a29454 fix: credential sending for file uploads (#5869) 2024-10-21 17:46:46 +05:30
M. Palanikannan
c940a2921e fix: validation of public and private assets (#5878) 2024-10-21 15:59:44 +05:30
Anmol Singh Bhatia
6f8df3279c [WEB-2681] fix: module progress indicator (#5842)
* fix: module progress indicator

* fix: module progress indicator
2024-10-21 15:48:35 +05:30
Prateek Shourya
b833e3b10c [WEB-2674] chore: open parent issues in peek-overview from the parent badge. (#5872)
* [WEB-2674] chore: open parent issues in peek-overview from the parent badge.

* chore: remove `_blank` target from ControlLink.
2024-10-21 14:20:00 +05:30
M. Palanikannan
5a0dc4a65a [PE-69] fix: image restoration fixed for new images in private bucket (#5839)
* regression: image aspect ratio fix

* fix: name of variables changed for clarity

* fix: restore only on error

* fix: restore image by handling it inside the image component

* fix: image restoration fixed and aspect ratio added to old images to stop updates on load

* fix: added back restoring logic for public images

* fix: add conditions

* fix: image attributes types

* fix: return for old images

* fix: remove passive false

* fix: eslint fixes

* fix: stopping infinite loading scenarios while restoring from error
2024-10-21 14:17:05 +05:30
Ketan Sharma
e866571e04 fix backend (#5875) 2024-10-21 13:07:36 +05:30
Bavisetti Narayan
3c3fc7cd6d chore: draft issue listing (#5874) 2024-10-21 13:02:20 +05:30
Bavisetti Narayan
db919420a7 [WEB-2693] chore: removed the deleted cycles from the issue list (#5868)
* chore: added the deleted cycles from list

* chore: removed the extra annotation

* chore: removed the frontend comment
2024-10-18 15:48:34 +05:30
M. Palanikannan
2982cd47a9 fix: remoteImageSrc to come from resolved source (#5867) 2024-10-18 14:21:07 +05:30
M. Palanikannan
81550ab5ef [PE-56] regression: image aspect ratio fix (#5792)
* regression: image aspect ratio fix

* fix: name of variables changed for clarity
2024-10-18 13:40:39 +05:30
Bavisetti Narayan
07402efd79 chore: filtered the deleted labels and modules (#5860) 2024-10-18 13:20:32 +05:30
Prateek Shourya
46302f41bc fix: improvements for project types. (#5857) 2024-10-18 11:08:07 +05:30
Ketan Sharma
9530884c59 fix the logic (#5807) 2024-10-17 17:08:49 +05:30
Prateek Shourya
173b49b4cb [WEB-2431] chore: profile settings page UI improvement (#5838)
* [WEB-2431] chore: timezone and language management.

* chore: remove project level timezone changes.

* chore: minor UI improvement.

* chore: minor improvements
2024-10-17 17:06:22 +05:30
Anmol Singh Bhatia
e581ac890e chore: workspace collaborators improvements (#5846) 2024-10-17 17:05:21 +05:30
Anmol Singh Bhatia
a7b58e4a93 [WEB-2625] chore: workspace favorite and draft improvement (#5855)
* chore: favorite empty state updated

* chore: added draft issue count in workspace members

* chore: workspace draft count improvement

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-10-17 17:02:25 +05:30
Bavisetti Narayan
d552913171 chore: updated queryset for soft delete (#5844) 2024-10-17 17:01:26 +05:30
Bavisetti Narayan
b6a7e45e8d chore: added draft cycle and module in draft issue (#5854) 2024-10-17 13:35:13 +05:30
Aaryan Khandelwal
6209aeec0b fix: color extension not working on issue description and published page (#5852)
* fix: color extension not working

* chore: update types
2024-10-17 13:26:23 +05:30
Anmol Singh Bhatia
1099c59b83 fix: draft issue empty state flicker (#5848) 2024-10-17 12:55:32 +05:30
Nikhil
9b2ffaaca8 fix: draft issue asset conversion to issue (#5849) 2024-10-17 12:51:13 +05:30
sriram veeraghanta
aa93cca7bf fix: workflow fixes 2024-10-16 21:07:01 +05:30
sriram veeraghanta
1191f74bfe fix: workflow fixes 2024-10-16 20:08:25 +05:30
sriram veeraghanta
fbd1f6334a fix: workflow fixes 2024-10-16 20:05:10 +05:30
Anmol Singh Bhatia
7d36d63eb1 [WEB-2682] fix: delete project mutation and workspace draft header validation (#5843)
* fix: workspace draft header action validation

* fix: delete project mutation
2024-10-16 16:13:26 +05:30
Nikhil
9b85306359 dev: move storage metadata collection to background job (#5818)
* fix: move storage metadata collection to background job

* fix: docker compose and env

* fix: archive endpoint
2024-10-16 13:55:49 +05:30
guru_sainath
cc613e57c9 chore: delete deprecated tables (#5833)
* migration: external source and id for issues

* fix: cleaning up deprecated favorite tables

* fix: removing deprecated models

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2024-10-16 00:33:57 +05:30
Bavisetti Narayan
6e63af7ca9 [WEB-2626] chore: removed the deleted issue's count (#5837)
* chore: removed the deleted issue count

* chore: added issue manager in burn down
2024-10-16 00:30:44 +05:30
guru_sainath
5f9af92faf fix: attachment_count in issue pagination v2 endpoint (#5840)
* fix: attachment_count in the issue pagination v2 endpoint

* fix: string comparison in description check in params
2024-10-15 23:46:57 +05:30
Anmol Singh Bhatia
4e70e894f6 chore: workspace draft issue type (#5836) 2024-10-15 18:59:22 +05:30
Anmol Singh Bhatia
ff090ecf39 fix: workspace draft move to project (#5834) 2024-10-15 17:14:56 +05:30
Akshita Goyal
645a261493 fix: Added a common dropdown component (#5826)
* fix: Added a common dropdown component

* fix: dropdown

* fix: estimate dropdown

* fix: removed consoles
2024-10-15 15:17:46 +05:30
Prateek Shourya
8d0611b2a7 [WEB-2613] chore: open parent and sibling issue in new tab from peek-overview/ issue detail page. (#5819) 2024-10-15 13:37:52 +05:30
Bavisetti Narayan
3d7d3c8af1 [WEB-2631] chore: changed the cascading logic for soft delete (#5829)
* chore: changed the cascading logic for soft delete

* chore: changed the delete key

* chore: added the key on delete in project base model
2024-10-15 13:30:44 +05:30
Prateek Shourya
662b99da92 [WEB-2577] improvement: use common create/update issue modal for accepting intake issues for consistency (#5830)
* [WEB-2577] improvement: use common create/update issue modal for accepting intake issues for consistency

* fix: lint errors.

* chore: minor UX copy fix.

* chore: minor indentation fix.
2024-10-15 13:11:14 +05:30
Prateek Shourya
fa25a816a7 [WEB-2549] chore: ux copy update for project access. (#5831) 2024-10-15 12:57:29 +05:30
Anmol Singh Bhatia
ee823d215e [WEB-2629] chore: workspace draft issue ux copy updated (#5825)
* chore: workspace draft issue ux copy updated

* chore: workspace draft issue ux copy updated
2024-10-14 17:26:54 +05:30
Akshita Goyal
4b450f8173 fix: moved dropdowns to chart component + added pending icon (#5824)
* fix: moved dropdowns to chart component + added pending icon

* fix: copy changes

* fix: review changes
2024-10-14 17:00:58 +05:30
Anmol Singh Bhatia
36229d92e0 [WEB-2629] fix: workspace draft delete and move mutation (#5822)
* fix: mutation fix

* chore: code refactor

* chore: code refactor

* chore: useWorkspaceIssueProperties added
2024-10-14 16:50:19 +05:30
Anmol Singh Bhatia
cb90810d02 chore: double click action added and code refactor (#5821) 2024-10-14 16:46:08 +05:30
Anmol Singh Bhatia
658542cc62 [WEB-2616] fix: issue widget attachment (#5820)
* fix: issue widget attachment

* chore: comment added
2024-10-14 16:32:31 +05:30
Nikhil
701af734cd fix: export for analytics and csv (#5815) 2024-10-13 02:11:32 +05:30
Nikhil
cf53cdf6ba fix: analytics tab for private bucket (#5814) 2024-10-13 01:27:48 +05:30
Nikhil
6490ace7c7 fix: intake issue (#5813) 2024-10-13 00:44:52 +05:30
Nikhil
0ac406e8c7 fix: private bucket (#5812)
* fix: workspace level issue creation

* dev: add draft issue support, fix your work tab and cache invalidation for workspace level logos

* chore: issue description

---------

Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
2024-10-13 00:31:28 +05:30
Aaryan Khandelwal
e404450e1a [WEB-310] regression: generate file url function (#5811)
* fix: generate file url function

* chore: remove unused imports

* chore: replace indexOf logic with startsWith
2024-10-12 23:39:50 +05:30
sriram veeraghanta
7cc86ad4c0 chore: removing unused packages 2024-10-12 01:43:22 +05:30
Anmol Singh Bhatia
3acc9ec133 fix: intake exception error (#5810) 2024-10-11 22:01:39 +05:30
Anmol Singh Bhatia
286ab7f650 fix: workspace draft issues count (#5809) 2024-10-11 21:28:05 +05:30
Aaryan Khandelwal
7e334203f1 [WEB-310] dev: private bucket implementation (#5793)
* chore: migrations and backmigration to move attachments to file asset

* chore: move attachments to file assets

* chore: update migration file to include created by and updated by and size

* chore: remove unimport errors

* chore: make size as float field

* fix: file asset uploads

* chore: asset uploads migration changes

* chore: v2 assets endpoint

* chore: remove unused imports

* chore: issue attachments

* chore: issue attachments

* chore: workspace logo endpoints

* chore: private bucket changes

* chore: user asset endpoint

* chore: add logo_url validation

* chore: cover image url

* chore: change asset max length

* chore: pages endpoint

* chore: store the storage_metadata only when none

* chore: attachment asset apis

* chore: update create private bucket

* chore: make bucket private

* chore: fix response of user uploads

* fix: response of user uploads

* fix: job to fix file asset uploads

* fix: user asset endpoints

* chore: avatar for user profile

* chore: external apis user url endpoint

* chore: upload workspace and user asset actions updated

* chore: analytics endpoint

* fix: analytics export

* chore: avatar urls

* chore: update user avatar instances

* chore: avatar urls for assignees and creators

* chore: bucket permission script

* fix: all user avatar instances in the web app

* chore: update project cover image logic

* fix: issue attachment endpoint

* chore: patch endpoint for issue attachment

* chore: attachments

* chore: change attachment storage class

* chore: update issue attachment endpoints

* fix: issue attachment

* chore: update issue attachment implementation

* chore: page asset endpoints

* fix: web build errors

* chore: attachments

* chore: page asset urls

* chore: comment and issue asset endpoints

* chore: asset endpoints

* chore: attachment endpoints

* chore: bulk asset endpoint

* chore: restore endpoint

* chore: project assets endpoints

* chore: asset url

* chore: add delete asset endpoints

* chore: fix asset upload endpoint

* chore: update patch endpoints

* chore: update patch endpoint

* chore: update editor image handling

* chore: asset restore endpoints

* chore: avatar url for space assets

* chore: space app assets migration

* fix: space app urls

* chore: space endpoints

* fix: old editor images rendering logic

* fix: issue archive and attachment activity

* chore: asset deletes

* chore: attachment delete

* fix: issue attachment

* fix: issue attachment get

* chore: cover image url for projects

* chore: remove duplicate py file

* fix: url check function

* chore: chore project cover asset delete

* fix: migrations

* chore: delete migration files

* chore: update bucket

* fix: build errors

* chore: add asset url in intake attachment

* chore: project cover fix

* chore: update next.config

* chore: delete old workspace logos

* chore: workspace assets

* chore: asset get for space

* chore: update project modal

* chore: remove unused imports

* fix: space app editor helper

* chore: update rich-text read-only editor

* chore: create multiple column for entity identifiers

* chore: update migrations

* chore: remove entity identifier

* fix: issue assets

* chore: update maximum file size logic

* chore: update editor max file size logic

* fix: close modal after removing workspace logo

* chore: update uploaded assets' status post issue creation

* chore: added file size limit to the space app

* dev: add file size limit restriction on all endpoints

* fix: remove old workspace logo and user avatar

---------

Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
2024-10-11 20:13:38 +05:30
Anmol Singh Bhatia
c9580ab794 chore: workspace draft issue improvements (#5808) 2024-10-11 19:51:38 +05:30
Aaryan Khandelwal
e7065af358 [WEB-2494] dev: custom text color and background color extensions (#5786)
* dev: created custom text color and background color extensions

* chore: update slash commands icon style

* chore: update constants

* chore: update variables css file selectors
2024-10-11 19:11:39 +05:30
Manish Gupta
74695e561a modified the action name (#5806) 2024-10-11 18:05:53 +05:30
Anmol Singh Bhatia
c9dbd1d5d1 [WEB-2388] chore: theme changes and workspace draft issue total count updated (#5805)
* chore: theme changes and total count updated

* chore: code refactor
2024-10-11 17:57:48 +05:30
Manish Gupta
6200890693 fix: updated branch build action with BUILD/RELEASE options (#5803) 2024-10-11 17:25:25 +05:30
guru_sainath
3011ef9da1 build-error: removed store prop from calendar store (#5801) 2024-10-11 15:53:58 +05:30
Anmol Singh Bhatia
bf7b3229d1 [WEB-2388] fix: workspace draft issues (#5800)
* fix: create issue modal handle close

* fix: workspace level draft issue store update

* chore: count added

* chore: added description html in list endpoint

* fix: workspace draft issue mutation

* fix: workspace draft issue empty state and count

---------

Co-authored-by: gurusainath <gurusainath007@gmail.com>
2024-10-11 15:23:32 +05:30
rahulramesha
2c96e042c6 fix workspace drafts build (#5798) 2024-10-10 22:59:27 +05:30
rahulramesha
9c2278a810 fix workspace draft build (#5795) 2024-10-10 20:50:43 +05:30
Anmol Singh Bhatia
332d2d5c68 [WEB-2388] dev: workspace draft issues (#5772)
* chore: workspace draft page added

* chore: workspace draft issues services added

* chore: workspace draft issue store added

* chore: workspace draft issue filter store added

* chore: issue rendering

* conflicts: resolved merge conflicts

* conflicts: handled draft issue store

* chore: draft issue modal

* chore: code optimisation

* chore: ui changes

* chore: workspace draft store and modal updated

* chore: workspace draft issue component added

* chore: updated store and workflow in draft issues

* chore: updated issue draft store

* chore: updated issue type cleanup in components

* chore: code refactor

* fix: build error

* fix: quick actions

* fix: update mutation

* fix: create update modal

* chore: commented project draft issue code

---------

Co-authored-by: gurusainath <gurusainath007@gmail.com>
Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-10-10 19:12:34 +05:30
guru_sainath
e9158f820f [WEB-2615] fix: module date validation during chart distribution generation (#5791)
* fix: module date validation while generating the chart distribution

* chore: indentation fix

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-10-10 18:33:59 +05:30
sriram veeraghanta
1e1733f6db Merge branch 'master' of github.com:makeplane/plane into preview 2024-10-10 17:24:47 +05:30
Bavisetti Narayan
5573d85d80 chore: only admins can delete a project (#5790) 2024-10-10 17:24:18 +05:30
sriram veeraghanta
c1f881b2d1 Merge branch 'develop' of github.com:makeplane/plane into preview 2024-10-10 15:11:33 +05:30
sriram veeraghanta
9bab108329 Merge pull request #5788 from makeplane/preview
release: v0.23.1
2024-10-10 15:11:04 +05:30
sriram veeraghanta
5f4875cc60 fix: version bump 2024-10-10 15:05:03 +05:30
sriram veeraghanta
0c1c6dee99 fix: adding scheduled tracing 2024-10-10 14:57:42 +05:30
sriram veeraghanta
1639f34db0 Merge branch 'preview' of github.com:makeplane/plane into develop 2024-10-10 14:07:25 +05:30
Bavisetti Narayan
8a866e440c chore: only admins can change the project settings (#5766) 2024-10-10 14:06:14 +05:30
Prateek Shourya
7495a7d0cb [WEB-2605] fix: update URL regex pattern to allow complex links. (#5767) 2024-10-10 14:06:14 +05:30
M. Palanikannan
2b1da96c3f fix: drag handle scrolling fixed (#5619)
* fix: drag handle scrolling fixed

* fix: closest scrollable parent found and scrolled

* fix: removed overflow auto from framerenderer

* fix: make dragging dynamic and smoother
2024-10-10 14:06:14 +05:30
Aaryan Khandelwal
daa06f1831 [WEB-2532] fix: custom theme mutation logic (#5685)
* fix: custom theme mutation logic

* chore: update querySelector element
2024-10-10 14:06:14 +05:30
M. Palanikannan
b97fcfb46d fix: show the full screen toolbar in read only instances as well (#5746) 2024-10-10 14:06:14 +05:30
M. Palanikannan
852fc9bac1 [WEB-2603] fix: remove validation of roles from the live server (#5761)
* fix: remove validation of roles from the live server

* chore: remove the service

* fix: remove all validation of authorization

* fix: props updated
2024-10-10 14:06:14 +05:30
Akshita Goyal
55f44e0245 fix: spreadsheet flicker issue (#5769) 2024-10-10 14:06:14 +05:30
Prateek Shourya
8981e52dcc [WEB-2601] improvement: add click to copy issue identifier on peek-overview and issue detail page. (#5760) 2024-10-10 14:06:14 +05:30
Akshita Goyal
d92dbaea72 [WEB-2589] Chore: inbox issue permissions (#5763)
* chore: changed permission in inbox issue

* chore: fixed permissions for intake

* fix: refactoring

* fix: lint

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-10-10 14:06:14 +05:30
dependabot[bot]
58f3d0a68c chore(deps): bump django in /apiserver/requirements (#5781)
Bumps [django](https://github.com/django/django) from 4.2.15 to 4.2.16.
- [Commits](https://github.com/django/django/compare/4.2.15...4.2.16)

---
updated-dependencies:
- dependency-name: django
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-10-10 14:06:14 +05:30
Akshita Goyal
45880b3a72 [WEB-2589] Chore: inbox issue permissions (#5763)
* chore: changed permission in inbox issue

* chore: fixed permissions for intake

* fix: refactoring

* fix: lint

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-10-09 17:48:52 +05:30
dependabot[bot]
992adb9794 chore(deps): bump django in /apiserver/requirements (#5781)
Bumps [django](https://github.com/django/django) from 4.2.15 to 4.2.16.
- [Commits](https://github.com/django/django/compare/4.2.15...4.2.16)

---
updated-dependencies:
- dependency-name: django
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-10-09 17:26:33 +05:30
Akshita Goyal
6d78418e79 fix: create cycle function (#5775)
* fix: create cycle function

* chore: draft and cycle version changes

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-10-08 20:01:15 +05:30
Prateek Shourya
6e52f1b434 [WEB-2601] improvement: add click to copy issue identifier on peek-overview and issue detail page. (#5760) 2024-10-08 18:43:13 +05:30
Aaryan Khandelwal
c3c1ea727d [WEB-2494] feat: text color and highlight options for all editors (#5653)
* feat: add text color and highlight options to pages

* style: rich text editor floating toolbar

* chore: remove unused function

* refactor: slash command components

* chore: move default text and background options to the top

* fix: sections filtering logic
2024-10-08 18:42:47 +05:30
Aaryan Khandelwal
5afc576dec refactor: export components (#5773) 2024-10-08 18:41:08 +05:30
Ketan Sharma
50ae32f3e1 [WEB-2555] fix: add "mark all as read" in the notifications header (#5770)
* move mark all as read to header and remove it from dropdown

* made recommended changes
2024-10-08 17:13:35 +05:30
Akshita Goyal
0451593057 fix: spreadsheet flicker issue (#5769) 2024-10-08 17:10:16 +05:30
M. Palanikannan
be092ac99f [WEB-2603] fix: remove validation of roles from the live server (#5761)
* fix: remove validation of roles from the live server

* chore: remove the service

* fix: remove all validation of authorization

* fix: props updated
2024-10-08 16:55:26 +05:30
Anmol Singh Bhatia
f73a603226 [WEB-2380] chore: cycle sidebar refactor (#5759)
* chore: cycle sidebar refactor

* chore: code splitting

* chore: code refactor

* chore: code refactor
2024-10-08 16:54:44 +05:30
Aaryan Khandelwal
b27249486a [PE-45] feat: page export as PDF & Markdown (#5705)
* feat: export page as pdf and markdown

* chore: add image conversion logic
2024-10-08 16:54:02 +05:30
Anmol Singh Bhatia
20c9e232e7 chore: IssueParentDetail added to issue peekoverview (#5751) 2024-10-08 16:53:07 +05:30
Bavisetti Narayan
d168fd4bfa [WEB-2388] fix: workspace draft issues migration (#5749)
* fix: workspace draft issues

* chore: changed the timezone key

* chore: migration changes
2024-10-08 16:51:57 +05:30
M. Palanikannan
7317975b04 fix: show the full screen toolbar in read only instances as well (#5746) 2024-10-08 16:50:32 +05:30
Aaryan Khandelwal
39195d0d89 [WEB-2532] fix: custom theme mutation logic (#5685)
* fix: custom theme mutation logic

* chore: update querySelector element
2024-10-08 16:47:16 +05:30
Mihir
6bf0e27b66 [WEB-2433] chore-Update name of the Layout (#5661)
* Updated layout names

* Corrected character casing for titles
2024-10-08 16:44:50 +05:30
M. Palanikannan
5fb7e98b7c fix: drag handle scrolling fixed (#5619)
* fix: drag handle scrolling fixed

* fix: closest scrollable parent found and scrolled

* fix: removed overflow auto from framerenderer

* fix: make dragging dynamic and smoother
2024-10-08 16:44:05 +05:30
Prateek Shourya
328b6961a2 [WEB-2605] fix: update URL regex pattern to allow complex links. (#5767) 2024-10-08 13:20:27 +05:30
Bavisetti Narayan
39eabc28b5 chore: only admins can change the project settings (#5766) 2024-10-07 20:07:24 +05:30
sriram veeraghanta
d97ca68229 Merge pull request #5764 from makeplane/preview
release: v0.23.0
2024-10-07 18:54:49 +05:30
Bavisetti Narayan
c92fe6191e [WEB-2600] fix: estimate point deletion (#5762)
* chore: only delete the cascade fields

* chore: logged the issue activity
2024-10-07 17:23:37 +05:30
pablohashescobar
7bb04003ea fix: instance trace 2024-10-07 15:56:27 +05:30
sriram veeraghanta
19dab1fad0 Merge branch 'preview' of github.com:makeplane/plane into develop 2024-10-07 13:20:07 +05:30
M. Palanikannan
5f7b6ecf7f fix: image deletion on submit fixed in comments (#5748)
* fix: image deletion on submit fixed in comments

* fix: cleareditor added to read only editor

* fix: image component double drop fixed

* feat: multiple image selection and uploading

* fix: click event on read only instance

* fix: made things async

* fix: prevented default behaviour

* fix: removed extra dep and cleaned up logic
2024-10-07 13:12:16 +05:30
guru_sainath
dfd3af13cf fix: handled favorite entity data null (#5756) 2024-10-07 12:57:15 +05:30
pablohashescobar
4cc1b79d81 chore: instance tracing 2024-10-04 21:35:13 +05:30
sriram veeraghanta
4a6f646317 fix: lockfile update 2024-10-04 19:38:19 +05:30
sriram veeraghanta
b8e21d92bf Merge branch 'preview' of github.com:makeplane/plane into preview 2024-10-04 19:26:06 +05:30
sriram veeraghanta
b87d5c5be6 fix: version upgrade 2024-10-04 19:25:49 +05:30
ach5948
ceda06e88d fix: Remove typo from Contributing doc (#5736) 2024-10-04 19:24:47 +05:30
sriram veeraghanta
eb344881c2 Merge branch 'preview' of github.com:makeplane/plane into develop 2024-10-04 19:22:26 +05:30
Satish Gandham
01257a6936 chore: permission layer and updated issues v1 query from workspace to project level (#5753)
Co-authored-by: gurusainath <gurusainath007@gmail.com>
2024-10-04 18:34:46 +05:30
Prateek Shourya
51b01ebcac [WEB-2580] chore: improvements for custom search select. (#5744)
* [WEB-2580] chore: improvements for custom search select.

* chore: update optionTooltip prop.

* chore: update option tooltip prop.

* chore: minor updates.
2024-10-04 17:31:09 +05:30
sriram veeraghanta
0a8d66dcc3 fix: trace information setup 2024-10-04 16:40:33 +05:30
Akshita Goyal
ec22f1fc53 fix: cycles build issue (#5750) 2024-10-04 14:11:26 +05:30
sriram veeraghanta
a5e3e4fe7d fix: api tracing 2024-10-04 01:14:29 +05:30
Akshita Goyal
f1a0a8d925 Fix: Cycle graphs refactor (#5745)
* fix: community changes for cycle graphs

* fix: added dependency from root package.json
2024-10-03 19:25:53 +05:30
Mihir
ee0dce46de [WEB-2520] fix-Sorted Icon Not Updating Dynamically in Spreadsheet View (#5688)
* Updated conditional rendering of sorting icons

* Removed unused imports
2024-10-03 17:31:08 +05:30
Ketan Sharma
b7ee7e19fc [WEB-2213] fix: group by persistence for list view (#5590)
* fix kanban view localStorage

* add functionality for list view and add type for kanban function

* add comment in issue-filter-helper store

* improved code quality

* add comment for clarity

* use better variable names

* use useCallback hook and change variable name

* made suggested changes
2024-10-03 17:29:50 +05:30
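
The group-by persistence fix above hinges on remembering the selected grouping per view in the browser. A minimal TypeScript sketch of that general pattern, using hypothetical storage keys and helper names rather than Plane's actual store code:

    // Hypothetical helpers: persist and restore a layout's group-by setting per project.
    type GroupByKey = "state" | "priority" | "assignees" | "labels" | "none";
    type Layout = "list" | "kanban";

    const storageKey = (projectId: string, layout: Layout) =>
      `issue_layout_group_by:${projectId}:${layout}`;

    export const saveGroupBy = (projectId: string, layout: Layout, groupBy: GroupByKey): void => {
      try {
        localStorage.setItem(storageKey(projectId, layout), groupBy);
      } catch {
        // storage may be unavailable (private mode, quota exceeded); fail silently
      }
    };

    export const loadGroupBy = (projectId: string, layout: Layout): GroupByKey => {
      const stored = localStorage.getItem(storageKey(projectId, layout)) as GroupByKey | null;
      return stored ?? "none"; // fall back to the default grouping
    };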
rahulramesha
8291043704 Stop duplicate issue layout updates (#5743) 2024-10-03 16:51:18 +05:30
Akshat Jain
bc41b1113a add /live path in proxy pass (#5742) 2024-10-03 15:09:13 +05:30
Dancia
77d4a8379d Updated SECURITY.md (#5737)
* Updated SECURITY.md

* Updated SECURITY.md

* minor fix
2024-10-03 14:09:01 +05:30
Prateek Shourya
c90df623de fix: live base server url. (#5734)
* fix: live base server url.

* chore: update websocket URL logic.
2024-10-03 14:06:03 +05:30
Prateek Shourya
62c45f3bb1 [WEB-2559] fix: live server URL generation for self-managed instances. (#5733) 2024-10-01 21:03:17 +05:30
Prateek Shourya
96dc9db237 [WEB-2559] fix: web socket protocol. (#5731) 2024-10-01 19:57:17 +05:30
Akshita Goyal
5474ab326d fix: cycles import issues for ee (#5732) 2024-10-01 19:52:52 +05:30
Akshita Goyal
4940dc2193 Chore: progress chart changes (#5707)
* fix: progress chart code splitting

* fix: progress chart code splitting

* fix: build errors + review changes
2024-10-01 18:59:49 +05:30
Satish Gandham
632282d0df Fix build errors and unnecessary console.logs (#5730) 2024-10-01 15:31:04 +05:30
Satish Gandham
33f6c1fe9e [WEB-2001] feat: Fix local cache issues r4 (#5726)
* - Handle single quotes in load workspace queries
- Add IS null where condition in query utils

* Fix description_html being lost

* Change secondary order to sequence_id

* Fix update persistence layer

* Add instrumentation

* - Fallback to server in case of any error
2024-10-01 14:18:01 +05:30
Prateek Shourya
927d265209 [WEB-2573] improvement: search-issues API optimization. (#5727)
* limit search results to 100 issues.
2024-10-01 14:15:35 +05:30
M. Palanikannan
bfef0e89e0 [PE-46] fix: added aspect ratio to resizing (#5693)
* fix: added aspect ratio to resizing

* fix: image loading

* fix: image uploading and adding only necessary keys to listen to

* fix: image aspect ratio maintenance done

* fix: loading of images with uploads

* fix: custom image extension loading fixed

* fix: refactored all the upload logic

* fix: focus detection for editor fixed

* fix: drop images and inserting images cleaned up

* fix: cursor focus after image node insertion and multi drop/paste range error fix

* fix: image types fixed

* fix: remove old images' upload code and cleaning up the code

* fix: imports

* fix: this reference in the plugin

* fix: added file validation

* fix: added error handling while reading files

* fix: prevent old data to be updated in updateAttributes

* fix: props types for node and image block

* fix: remove unnecessary dependency

* fix: separated display message logic from ui

* chore: added comments to better explain the loading states

* fix: added getPos to deps

* fix: remove click event on failed to load state

* fix: css for error and selected state
2024-09-30 19:43:14 +05:30
Prateek Shourya
e9d5db0093 [WEB-2568] chore: minor improvements for issue activity component. (#5725) 2024-09-30 19:23:24 +05:30
M. Palanikannan
bcd46b6aa9 fix: missing editor package (#5708) 2024-09-30 17:58:11 +05:30
Prateek Shourya
66ca1663bf [WEB-2579] fix: frequent loader on issue detail / archived issue detail page. (#5724)
* [WEB-2579] fix: frequent loader on issue detail / archived issue detail page.

* chore: minor improvement.
2024-09-30 17:32:08 +05:30
Akshat Jain
944f3417a1 chore: added live dev script (#5715)
* add live dev script

* fix: redis changes in .env.example
2024-09-30 17:03:29 +05:30
Ketan Sharma
193d530b40 [WEB-2550] fix: spacing by removing the right border (#5699)
* fix spacing by removing the right border

* remove log statement

* replicate the same for space
2024-09-30 16:17:57 +05:30
Aaryan Khandelwal
3b0f3ca761 chore: show content loader until the server has synced (#5657) 2024-09-30 15:57:19 +05:30
Mihir
7f5a898cec [WEB-2266] chore-No favorites should be aligned like the rest of the things (#5618)
* Updated alignment of empty favorite text

* Updated padding
2024-09-30 15:49:30 +05:30
Mihir
bf6588b573 Updated notification text wrap (#5607) 2024-09-30 15:46:35 +05:30
Prateek Shourya
c25fa594fe [WEB-2568] chore: minor improvements related to issue identifier and issue modal. (#5723)
* [WEB-2568] chore: minor improvements related to issue identifier and issue modal.

* fix: error handling for session recorder script.

* chore: minor improvement
2024-09-30 14:07:22 +05:30
Prateek Shourya
b1dccf3773 chore: properties validation. (#5718) 2024-09-27 21:46:11 +05:30
Aaryan Khandelwal
04686d1721 fix: convert image size to string (#5717) 2024-09-27 20:39:50 +05:30
Satish Gandham
ec08fb078d [WEB-2001] feat: Fix local cache issues r3 (#5714)
* - Handle single quotes in load workspace queries
- Add IS null where condition in query utils

* Fix description_html being lost

* Change secondary order to sequence_id

* Fix update persistence layer

* Fix issue types filter
Fix none filter

* add local cache toggle in help section

* remove toggle from user settings

* Reset storage class on disabling local

---------

Co-authored-by: rahulramesha <rahulramesham@gmail.com>
2024-09-27 15:11:38 +05:30
Satish Gandham
8aa32d410c [WEB-2001] feat: Fix local cache issues v2 (#5712)
* - Handle single quotes in load workspace queries
- Add IS null where condition in query utils

* Fix description_html being lost

* Change secondary order to sequence_id

* Fix update persistence layer
2024-09-27 13:19:38 +05:30
Aaryan Khandelwal
ade03e9f8f chore: move headings list extension to the document editor (#5711) 2024-09-27 08:24:04 +05:30
Anmol Singh Bhatia
d253933995 [WEB-2552] fix: issue list overflow and event propagation (#5706) 2024-09-26 16:55:01 +05:30
Anmol Singh Bhatia
150af986fd fix: list layout item (#5704) 2024-09-26 14:11:48 +05:30
Satish Gandham
f3340749e8 [WEB-2001] fix: Issue local cache fixes (#5703)
* Fix sync of local updates

* Escape single quotes!!

* Fix last updated time query

* Move console.logs out

* Fix issue title not rendering line breaks when disabled

* Add a todo

* Fix build errors

* Disable local
2024-09-26 14:04:59 +05:30
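
The "Escape single quotes" item above addresses SQL literals breaking when issue text contains an apostrophe. A minimal sketch of the standard technique (doubling single quotes before interpolating into a SQLite query); the names here are illustrative, and the real persistence layer could equally use bound parameters:

    // Hypothetical helper: make a string safe for use inside a single-quoted SQL literal.
    export const escapeSQLString = (value: string): string => value.replace(/'/g, "''");

    // Usage sketch: building a filter for a locally cached issue query.
    const buildNameFilter = (name: string): string =>
      `SELECT id FROM issues WHERE name = '${escapeSQLString(name)}'`;

    // buildNameFilter("Can't reproduce") -> SELECT id FROM issues WHERE name = 'Can''t reproduce'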
rahulramesha
6e0ece496a fix peek overview loading state (#5698) 2024-09-26 13:29:34 +05:30
sriram veeraghanta
0068ea93de fix: rollup dependabot vulnerability fix 2024-09-25 19:35:26 +05:30
Prateek Shourya
6942e491d0 [WEB-2542] Fix: display filter and tooltip issues in list layout. (#5696)
* [WEB-2542] fix: list layout issues.
* fix: issue type display filter not working.
* fix: layout shift when hovered on bulkops checkbox.

* fix: build errors.

* fix: lint errors
2024-09-25 17:47:46 +05:30
Anmol Singh Bhatia
22623fad33 [WEB-2543] chore: workspace inbox guest permission (#5695)
* chore: workspace inbox permission updated

* chore: workspace inbox permission updated

* chore: code refactor

* chore: code refactor
2024-09-25 17:17:42 +05:30
Aaryan Khandelwal
85f7483b1b fix: update version history overlay z-index (#5694) 2024-09-25 14:11:21 +05:30
Anmol Singh Bhatia
fbb60941ef fix: issue quick action (#5692) 2024-09-25 13:50:44 +05:30
M. Palanikannan
20e569294d [WEB-2528] fix: side menu rendering even if created already (#5687)
* fix: side menu rendering even if created already

* fix: drag handles position
2024-09-24 20:11:49 +05:30
rahulramesha
117afdb67f add requestIdleCallback polyfill to fix Safari crash (#5689) 2024-09-24 19:37:12 +05:30
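
The Safari crash fix above works around requestIdleCallback not being implemented there. A minimal sketch of the common timeout-based polyfill; this illustrates the generic pattern, not necessarily the exact code merged in #5689:

    // Fall back to setTimeout when requestIdleCallback is unavailable (e.g. Safari).
    if (typeof window !== "undefined" && !("requestIdleCallback" in window)) {
      (window as any).requestIdleCallback = (cb: IdleRequestCallback): number =>
        window.setTimeout(() => {
          const start = Date.now();
          cb({
            didTimeout: false,
            timeRemaining: () => Math.max(0, 50 - (Date.now() - start)),
          });
        }, 1);

      (window as any).cancelIdleCallback = (id: number): void => window.clearTimeout(id);
    }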
Satish Gandham
3df230393a [WEB-2001]feat: Cache issues on the client (#5327)
* use common getIssues from issue service instead of multiple different services for modules and cycles

* Use SQLite to store issues locally and load issues from it.

* Fix incorrect total count and filtering on assignees.

* enable parallel API calls

* use common getIssues from issue service instead of multiple different services for modules and cycles

* Use SQLite to store issues locally and load issues from it.

* Fix incorrect total count and filtering on assignees.

* enable parallel API calls

* chore: deleted issue list

* - Handle local mutations
- Implement getting the updates
- Use SWR to update/sync data

* Wait for sync to complete in get issues

* Fix build errors

* Fix build issue

* - Sync updates to local-db
- Fallback to server when the local data is loading
- Wait when the updates are being fetched

* Add issues in batches

* Disable skeleton loaders for first 10 issues

* Load issues in bulk

* working version of sql lite with grouped issues

* Use window queries for group by

* - Fix sort by date fields
- Fix the total count

* - Fix grouping by created by
- Fix order by and limit

* fix pagination

* Fix sorting on issue priority

* - Add secondary sort order
- Fix group by priority

* chore: added timestamp filter for deleted issues

* - Extract local DB into its own class
- Implement sorting by label names

* Implement subgroup by

* sub group by changes

* Refactor query constructor

* Insert or update issues instead of directly adding them.

* Segregated queries. Not working though!!

* - Get filtered issues and then group them.
- Cleanup code.
- Implement order by labels.

* Fix build issues

* Remove debuggers

* remove loaders while changing sorting or applying filters

* fix loader while clearing all filters

* Fix issue with project being synced twice

* Improve project sync

* Optimize the queries

* Make create dummy data more realistic

* dev: added total pages in the global paginator

* chore: updated total_paged count

* chore: added state_group in the issues pagination

* chore: removed deleted_at from the issue pagination payload

* chore: replaced state_group with state__group

* Integrate new getIssues API, and fix sync issues bug.

* Fix issue with SWR running twice in workspace wrapper

* Fix DB initialization called when opening project for the first time.

* Add all the tables required for sorting

* Exclude description from getIssues

* Add getIssue function.

* Add only selected fields to get query.

* Fix the count query

* Minor query optimization when no joins are required.

* fetch issue description from local db

* clear local db on signout

* Correct dummy data creation

* Fix sort by assignee

* sync to local changes

* chore: added archived issues in the deleted endpoint

* Sync deletes to local db.

* - Add missing indexes for tables used in sorting in spreadsheet layout.
- Add options table

* Make fallback optional in getOption

* Kanban column virtualization

* persist project sync readiness to sqlite and use that as the source of truth for the project issues to be ready

* fix build errors

* Fix calendar view

* fetch slimed down version of modules in project wrapper

* fetch toned down modules and then fetch complete modules

* Fix multi value order by in spread sheet layout

* Fix sort by

* Fix the query when ordering by multi field names

* Remove unused import

* Fix sort by multi value fields

* Format queries and fix order by

* fix order by for multi issue

* fix loaders for spreadsheet

* Fallback to manual order when moving away from spreadsheet layout

* fix minor bug

* Move fix for order_by when switching from spreadsheet layout to translateQueryParams

* fix default rendering of kanban groups

* Fix none priority being saved as null

* Remove debugger statement

* Fix issue load

* chore: updated issue paginated query from  to

* Fix sub issues and start and target date filters

* Fix active and backlog filter

* Add default order by

* Update the Query param to match with backend.

* local sqlite db versioning

* When window is hidden, do not perform any db versioning

* fix error handling and fall back to server when database errors out

* Add ability to disable local db cache

* remove db version check from getIssues function

* change db version to number and remove workspaceInitPromise in storage.sqlite

* - Sync the entire workspace in the background
- Add get sub issue method with distribution

* Make changes to get issues for sync to match backend.

* chore: handled workspace and project in v2 paginated issues

* disable issue description and title until fetched from server

* sync issues post bulk operations

* fix server error

* fix front end build

* Remove full workspace sync

* - Remove the toast message on sync.
- Update the disable local message.

* Add Hardcoded constant to disable the local db caching

* fix lint errors

* Fix order by in grouping

* update yarn lock

* fix build

* fix plane-web imports

* address review comments

---------

Co-authored-by: rahulramesha <rahulramesham@gmail.com>
Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
Co-authored-by: gurusainath <gurusainath007@gmail.com>
2024-09-24 19:01:34 +05:30
Aaryan Khandelwal
8dabe839f3 fix: pass update image size (#5686) 2024-09-24 16:09:12 +05:30
Anmol Singh Bhatia
6b63e050ae [WEB-2525] fix: activity filters (#5682)
* fix: activity filters

* chore: code refactor
2024-09-24 16:08:28 +05:30
rahulramesha
6170a80757 [WEB-2001] chore: Code refactor for noload changes. (#5683)
* use common getIssues from issue service instead of multiple different services for modules and cycles

* add group by to server constants

* change issue detail's overview's is loading logic to the loader from the store

* add extra method in local storage

* Kanban render 10 issues by default per column

* fix height in group virtualization

* remove debounced code for Kanban fetching more issues per column

* fix lint errors
2024-09-24 14:27:57 +05:30
Aaryan Khandelwal
5ca794b648 chore: remove line-through decoration from checked todo list items (#5659) 2024-09-24 13:56:36 +05:30
Prateek Shourya
f38755b755 [WEB-2496] style: fix invite member input alignment on error state. (#5658) 2024-09-23 18:56:22 +05:30
Aaryan Khandelwal
2153eda9a8 fix: editor container height (#5669) 2024-09-23 18:49:53 +05:30
sriram veeraghanta
83bfca6f2d fix: linting issues and rule changes (#5681)
* fix: lint config package updates

* fix: tsconfig changes

* fix: lint config setup

* fix: lint errors and adding new rules

* fix: lint errors

* fix: ui and editor lints

* fix: build error

* fix: editor tsconfig

* fix: lint errors

* fix: types fixes

---------

Co-authored-by: Anmol Singh Bhatia <anmolsinghbhatia@plane.so>
Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
2024-09-23 17:10:38 +05:30
Aaryan Khandelwal
e143e0a051 chore: add server name while server initialization (#5656) 2024-09-23 16:44:50 +05:30
Mihir
50af7c5bf6 Updated the empty state button text for analytics (#5678) 2024-09-23 16:44:11 +05:30
Aaryan Khandelwal
846398df41 fix: casing across all settings pages (#5675) 2024-09-23 16:41:25 +05:30
Aaryan Khandelwal
0853a2790f style: updated create workspace item text color (#5674) 2024-09-23 16:41:04 +05:30
Mihir
ed39f2dc37 [WEB-2390] fix: Clickable Area for Issue List Layout Item (#5536)
* Updated control block to cover the whole element

* Updated the control link to cover the whole issues and relation blocks

* updated word wrap in notifications

* Reverted break words as it's a different issue.
2024-09-23 16:36:58 +05:30
Bavisetti Narayan
45fded9842 chore: issue relation hard delete (#5671) 2024-09-23 16:33:39 +05:30
Mihir
76a34440c3 Updated icons to mutate (#5670) 2024-09-23 16:26:47 +05:30
Ketan Sharma
4d200ff0a3 [WEB-2427] fix: white background behind emoji (#5624)
* adding translucent background

* make icon rounded
2024-09-23 16:24:51 +05:30
Ketan Sharma
f49a2aa9e3 [WEB-2511] fix: fix overlapping issues for headers globally (#5667)
* fixed only for spreadsheet

* change package for global change

* made global and ad hoc changes

* fix border and z-index for intake and notifications header
2024-09-23 16:03:56 +05:30
Aaryan Khandelwal
83b83326c5 [WEB-2509] feat: fullscreen option for editor images (#5665)
* feat: editor image full screen mode

* fix: full screen modal visibility

* refactor: memoize calculations

* chore: update useEffect dependencies
2024-09-23 16:00:06 +05:30
Anmol Singh Bhatia
3c1779b287 fix: workspace setting validation (#5654) 2024-09-23 15:56:36 +05:30
Aaryan Khandelwal
22b32fd5c6 [WEB-2497] chore: update pages' offline badge tooltip content (#5652)
* chore: update offline badge tooltip content

* chore: revert yarn lock changes
2024-09-23 15:52:32 +05:30
rahulramesha
c4c2d81d24 fix build (#5679) 2024-09-23 15:40:34 +05:30
Aaryan Khandelwal
f9a8896486 [WEB-1116] chore: add fallback for the live server (#5622)
* chore: add fallback for the live server

* fix: update provider document after patch request

* chore: make the health check call only on connection fail

* chore: update debounce interval

* refactor: remove useSwr call for health check

* fix: pages fallback init
2024-09-23 15:35:06 +05:30
rahulramesha
ae1a63f832 [WEB-2518] chore: Reverse order by of priority keys (#5591)
* make front end changes for priority orderby reversal

* chore: handled priority ordering in issues pagination

---------

Co-authored-by: gurusainath <gurusainath007@gmail.com>
2024-09-23 14:58:05 +05:30
M. Palanikannan
a05876552c [WEB-1116] fix: page outline not reflecting changes in realtime (#5567)
* fix: svg not supported in image uploads

* fix: svg image file error message fixed

* fix: heading not updating with realtime

* chore: add read-only editor support

* fix: headings show on initial render

* fix: types and imports

---------

Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
2024-09-23 14:44:27 +05:30
rahulramesha
b6e813cb9a fix animation performance on kanban group virtualization (#5666) 2024-09-23 12:48:44 +05:30
rahulramesha
f328772b82 fix large dropdown properties truncation (#5672) 2024-09-22 01:42:16 +05:30
rahulramesha
604ddad3fa [WEB-2453] fix: Render on hover only when enabled (#5609) 2024-09-20 20:26:38 +05:30
rahulramesha
66cfc7344e change kanban group virtualization logic (#5664) 2024-09-20 14:39:28 +05:30
Aaryan Khandelwal
a4933b5614 chore: remove modal for creating a page (#5561) 2024-09-19 20:26:11 +05:30
Ketan Sharma
e70e27296b changes for web-2425 (#5616) 2024-09-19 20:15:10 +05:30
Prateek Shourya
361ef9236e [WEB-1970] fix: onboarding invitation page fluctuation on refresh. (#5627) 2024-09-19 17:51:22 +05:30
Ketan Sharma
450bb42c46 [WEB-2330] fix: you don't have permission toast on bulk delete (#5599)
* fix logic check boolean then call function

* minor code improvement

* fixed logic error
2024-09-19 17:49:30 +05:30
Aaryan Khandelwal
77152b3119 style: remove side menu position transition (#5637) 2024-09-19 17:47:34 +05:30
Ketan Sharma
e9464f9e68 [WEB-2475] fix: applied filters header z-index and transparency (#5632)
* fixed only for spreadsheet

* change package for global change
2024-09-19 17:36:52 +05:30
rahulramesha
c8c9638e5a fix render-if-visible-hoc's style calculation performance issue (#5647) 2024-09-19 10:02:46 +05:30
Akshita Goyal
bd0ca0cded fix: archive page break issue resolved (#5644) 2024-09-18 20:08:27 +05:30
Anmol Singh Bhatia
96781dbb0f fix: workspace view applied filters (#5651) 2024-09-18 20:07:01 +05:30
Bavisetti Narayan
19132d15b8 chore: pick first inbox issue (#5650) 2024-09-18 19:10:36 +05:30
sriram veeraghanta
6befc6e564 fix: upgrading nextjs package 2024-09-18 18:56:38 +05:30
Aaryan Khandelwal
441e5fc054 chore: update page lock authorization (#5635) 2024-09-18 18:21:05 +05:30
Aaryan Khandelwal
43633f2f28 fix: issue description value (#5636) 2024-09-18 18:20:43 +05:30
Anmol Singh Bhatia
3a9f01b9eb [WEB-2462] [WEB-2461] fix: project intake filters (#5645)
* chore: intake order by options updated

* fix: intake filters icon and spacing

* chore: code refactor
2024-09-18 18:10:30 +05:30
rahulramesha
5e83da9ca1 [WEB-2316] chore: Kanban group virtualization (#5565)
* kanban group virtualization

* minor name change
2024-09-18 18:03:49 +05:30
Akshita Goyal
aec4162c22 fix: webhook modal spacing (#5641) 2024-09-18 15:35:46 +05:30
Anmol Singh Bhatia
44542fdd6b fix: list layout quick action styling (#5639) 2024-09-18 15:33:20 +05:30
Anmol Singh Bhatia
5ad6e99327 fix: project settings layout (#5638) 2024-09-18 15:01:35 +05:30
Bavisetti Narayan
30018d64a2 chore: restrict member to see private projects (#5640) 2024-09-18 14:54:35 +05:30
Prateek Shourya
1c0c1586cb [WEB-2308] fix: description editor loader on issue modal when editing a sub issue from another project. (#5625) 2024-09-18 13:38:01 +05:30
Prateek Shourya
524033411e [WEB-2250] fix: filter projects with create permission while selecting the project in create issue modal. (#5630) 2024-09-18 13:32:24 +05:30
Prateek Shourya
3b40158d9a [WEB-2395] chore: minor UX copy update for what's new link. (#5626)
* [WEB-2395] chore: minor ux copy update for what's new link.

* fix: import errors.
2024-09-18 13:22:51 +05:30
Bavisetti Narayan
4d9115d51e chore: inbox rename (#5628) 2024-09-18 13:18:45 +05:30
M. Palanikannan
146a500f9f [WEB-2450] fix: image resize component (#5623)
* fix: image resize fixed for initial render

* fix: working image resize with mousemove handler only inside the editor

* fix: unnecessary calc

* fix: setting state to true
2024-09-17 16:54:42 +05:30
Anmol Singh Bhatia
7d7415b235 [WEB-2467] fix: platform bug (#5621)
* fix: reaction endpoint

* fix: project label edit permission

* fix: guest role upgrade

* fix: list layout dnd permission

* fix: module and cycle toast alert

* fix: leave project redirection
2024-09-17 16:43:51 +05:30
Akshita Goyal
7aea820cfa [WEB-2459] Fix: analytics scroll + dashboard stat minor padding (#5613)
* fix: analytics scroll + dashboard stat minor padding

* fix: build issue
2024-09-17 16:33:34 +05:30
sriram veeraghanta
69b4f155fc fix: yjs dependencies revert 2024-09-16 21:06:40 +05:30
sriram veeraghanta
8f492e4c6c fix: disable turbo telemetry on live service 2024-09-16 20:58:35 +05:30
M. Palanikannan
8533eba07d [WEB-2450] dev: custom image extension (#5585)
* fix: svg not supported in image uploads

* fix: svg image file error message fixed

* feat: add custom image node for uploads

* fix: combine two extensions

* fix: added new image extension to backend

* fix: type errors

* style: image drop node

* style: image resize handler

* fix: removed unused stuff

* fix: types of updateAttributes

* fix: image insertion at pos and loading effect added

* fix: resize image real time sync

* fix: drag drop menu

* feat: custom image component editor

* fix: reverted back styles

* fix: reverted back document info changes

* fix: css image css

* style: image selected and hover states

* refactor: custom image extension folder structure

* style: read-only image

* chore: remove file handler

* fix: fixed multi time file opener

* fix: editor readonly content set properly

* fix: old images not rendered as new ones

* fix: drop upload fixed

* chore: remove console logs

* fix: src of image node as dependency

* fix: helper library build fix

* fix: improved reflow/layout and fixed resizing

---------

Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
2024-09-16 19:36:20 +05:30
Anmol Singh Bhatia
edf0ab8175 fix: build error (#5617) 2024-09-16 19:22:47 +05:30
Anmol Singh Bhatia
45da70cf6a [WEB-2460] fix: role permission validation (#5615)
* fix: workspace menu quick action

* fix: guest role upgrade flow validation

* fix: create issue validation

* fix: create issue validation

* fix: cmd k permission validation

* fix: subscription validation

* fix: create label permission validation

* fix: build error

* chore: guest can comment in their created issues

* chore: changed the queryset

* chore: code refactor

* chore: code refactor

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-09-16 18:56:28 +05:30
Prateek Shourya
2e816656e5 [WEB-2112 | WEB-2113] dev: billing and change-log improvements. (#5614)
* chore: minor improvements in billing and changelogs.

* fix: lint errors.

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2024-09-16 18:36:17 +05:30
Aaryan Khandelwal
6826ce0465 [WEB-1116] chore: remove yjs packages from the editor (#5603)
* chore: remove yjs packages from the editor

* chore: updated yarn lock file
2024-09-16 18:28:09 +05:30
sriram veeraghanta
c4b5c737f3 fix: adding types in package 2024-09-16 17:54:23 +05:30
sriram veeraghanta
89a1c0b534 fix: build errors 2024-09-16 17:48:10 +05:30
Akshita Goyal
74507559b8 [WEB-2456] Chore: workspace member list additional info (#5604)
* chore: added last login medium

* chore: added email and authentication columns in member settings

* fix: revoked lock file changes

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-09-16 14:28:41 +05:30
Bavisetti Narayan
3ce84f78f1 chore: roles demotion (#5612) 2024-09-16 14:25:27 +05:30
Anmol Singh Bhatia
5ba1eeaf4c [WEB-2443] fix: join project flicker (#5602)
* fix: join project flicker

* fix: leave project mutation and code refactor
2024-09-16 14:16:23 +05:30
Anmol Singh Bhatia
c14d20c2e0 fix: workspace settings access validation updated (#5606) 2024-09-16 14:03:06 +05:30
sriram veeraghanta
f155a13929 fix: adding new session cookie name 2024-09-13 16:59:47 +05:30
Anmol Singh Bhatia
485caaf2ec [WEB-2443] fix: project member validation (#5601)
* fix: project member validation

* fix: project member validation
2024-09-13 16:28:03 +05:30
Ketan Sharma
b44dd28ac0 [WEB-2445] fix: date picker and member picker dropdown z-index for list, kanban and spreadsheet views (#5597)
* changes for list and kanban

* passing values for list and kanban

* spreadsheet changes

* fix use different props for different stylings

* fix z index
2024-09-13 12:03:00 +05:30
sriram veeraghanta
1b0e31027e fix: lint fixes and typescript version fixes 2024-09-12 20:39:31 +05:30
Anmol Singh Bhatia
1efb067274 fix: build error (#5598) 2024-09-12 20:22:50 +05:30
Prateek Shourya
b2533b94ce [WEB-2444] improvement: performance improvement for useOutsideClickDetector and usePeekOverviewOutsideClickDetector. (#5595)
* [WEB-2444] improvement: performance improvement for `useOutsideClickDetector` and `usePeekOverviewOutsideClickDetector`.

* Move outside click detector to plane helpers package.

* chore: remove plane helpers yarn.lock
2024-09-12 20:10:04 +05:30
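
For context on the hook named above, a minimal sketch of a generic outside-click detector in React/TypeScript; the optimized version moved into the plane helpers package may differ (for example, by sharing a single document listener):

    import { RefObject, useEffect } from "react";

    // Hypothetical minimal version: invoke the handler when a pointer-down lands outside the ref'd element.
    export const useOutsideClickDetector = (
      ref: RefObject<HTMLElement | null>,
      onOutsideClick: () => void
    ): void => {
      useEffect(() => {
        const handleMouseDown = (event: MouseEvent) => {
          if (ref.current && !ref.current.contains(event.target as Node)) onOutsideClick();
        };
        document.addEventListener("mousedown", handleMouseDown);
        return () => document.removeEventListener("mousedown", handleMouseDown);
      }, [ref, onOutsideClick]);
    };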
Anmol Singh Bhatia
441385fc95 [WEB-2443] fix: role validation and code refactor (#5596)
* chore: delete cycle toast message updated

* fix: view page empty state

* fix: project settings automation

* fix: intake delete action

* fix: project label validation

* fix: project label validation

* fix: project state permission updated

* chore: code refactor
2024-09-12 20:08:13 +05:30
sriram veeraghanta
5f1939cdeb fix: workflow sync fixes (#5594) 2024-09-12 17:22:41 +05:30
Anmol Singh Bhatia
9d694ab006 fix: not authorized flicker (#5593) 2024-09-12 16:26:57 +05:30
Anmol Singh Bhatia
48e97477ed fix: issue properties dropdown (#5592) 2024-09-12 16:02:56 +05:30
Anmol Singh Bhatia
33dd5fe8cc [WEB-2443] fix: project intake edit permission (#5588)
* fix: project intake edit permission

* chore: inbox issue validation changes

* fix: intake edit permission updated

* fix: project invite modal

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-09-12 14:44:21 +05:30
Anmol Singh Bhatia
aed2f2dd47 fix: page permission validation (#5589) 2024-09-12 14:39:38 +05:30
Ketan Sharma
eb84f165f4 [WEB-2282] fix: date picker and member picker dropdown z-index for list, kanban and spreadsheet views (#5555)
* changes for list and kanban

* passing values for list and kanban

* spreadsheet changes
2024-09-12 14:35:45 +05:30
Mihir
572644f7f9 Updated alignment inside kanban header (#5559) 2024-09-12 14:34:24 +05:30
Aaryan Khandelwal
ddbd9dfdc8 chore: add toast alerts post access change of a page (#5569) 2024-09-12 14:32:54 +05:30
Mihir
09578c9a7d Updates theme options to include custom theme option (#5574) 2024-09-12 14:32:14 +05:30
Mihir
e5ddfd322d [WEB-2393] chore: removal of .svg from supported image formats (#5582)
* Updated supported image formats

* Updated image accepting functions
2024-09-12 14:25:06 +05:30
Anmol Singh Bhatia
87d6544b72 fix: project favorite permission validation (#5587) 2024-09-12 14:09:19 +05:30
Bavisetti Narayan
fdcd9a376c [WEB-2357] fix: update and redefine user roles across the platform (#5466)
* chore: removed viewer role

* chore: indentation

* chore: remove viewer role

* chore: handled user permissions in store

* chore: updated the migration file

* chore: updated user permissions store

* chore: removed the owner key

* chore: code refactor

* chore: code refactor

* chore: code refactor

* chore: code refactor

* chore: code refactor

* fix: build error

* chore: updated user permissions store and handled the permissions fetch in workspace and project wrappers

* chore: package user enum updated

* chore: user permission updated

* chore: user permission updated

* chore: resolved build errors

* chore: resolved build error

* chore: resolved build errors

* chore: computedFn deep map issue resolved

* chore: added back migration

* chore: added new field in project table

* chore: removed member store in users

* chore: private project for admins

* chore: workspace notification access validation updated

* fix: workspace member edit option

* fix: project intake permission validation updated

* chore: workspace export settings permission updated

* chore: guest_view_all_issues added

* chore: guest_view_all_issues added

* chore: key changed for guest access

* chore: added validation for individual issues

* chore: changed the dashboard issues count

* chore: added new yarn file

* chore: modified yarn file

* chore: project page permission updated

* chore: project page permission updated

* chore: member setting ux updated

* chore: build error

* fix: yarn lock

* fix: build error

---------

Co-authored-by: gurusainath <gurusainath007@gmail.com>
Co-authored-by: Anmol Singh Bhatia <anmolsinghbhatia@plane.so>
2024-09-11 17:10:15 +05:30
Bavisetti Narayan
7013a36629 [WEB-2430] fix: issue exports for project (#5579)
* fix: issue exports for project

* chore: code cleanup
2024-09-11 13:18:59 +05:30
Anmol Singh Bhatia
bb49d27a84 fix: join project permission mutation (#5580) 2024-09-11 12:37:31 +05:30
Prateek Shourya
00b76300f5 [WEB-2421] chore: issue display properties and issue identifier improvements. (#5577)
* [WEB-2421] chore: issue display properties and issue identifier improvements.

* chore: remove yarn.lock changes.
2024-09-10 21:49:57 +05:30
sriram veeraghanta
71f3c5c12a fix: typescript upgrade build errors 2024-09-10 21:31:32 +05:30
sriram veeraghanta
99ab274216 fix: upgrading the python runtime version 2024-09-10 20:44:38 +05:30
sriram veeraghanta
04b10cabc8 fix: tailwind warning fixes 2024-09-10 17:57:06 +05:30
sriram veeraghanta
545717cc51 fix: security update for express package 2024-09-10 17:36:32 +05:30
sriram veeraghanta
1ca0a15792 fix: upgrading turbo version 2024-09-10 17:31:10 +05:30
sriram veeraghanta
c5971f03aa Merge branch 'preview' of github.com:makeplane/plane into preview 2024-09-10 17:29:34 +05:30
sriram veeraghanta
902403a54d chore: linting warning resolved 2024-09-10 17:29:19 +05:30
Akshat Jain
1d6ebb7c41 add the SERVICE_FOLDER value to install.sh script dynamically (#5553) 2024-09-10 17:29:16 +05:30
Rounak Shrestha
106914e14e fix: Local Setup on Windows (#5539) 2024-09-10 17:28:18 +05:30
Aaryan Khandelwal
8acb60baef [WEB-1116] fix: current version not displaying the latest content (#5573)
* fix: current version sync

* chore: update read only editor ref type
2024-09-10 16:13:20 +05:30
Manish Gupta
1da97d5814 skipped stable tag for prerelease, modified docker tag for branch name with special characters (#5570) 2024-09-10 15:10:10 +05:30
Goran
5fb2dd0b6e fix(webhook): allow private ip to be used as payload url (#5535)
Co-authored-by: gmajkic <gmajkic@veepee.com>
2024-09-10 14:57:30 +05:30
Akshita Goyal
ff6c3ce1a0 fix: settings page scrollbar (#5572) 2024-09-10 14:44:32 +05:30
Anmol Singh Bhatia
ec51e9d8ce fix-header-theme (#5564) 2024-09-10 14:42:24 +05:30
Aaryan Khandelwal
cc07992e47 [WEB-2424] fix: add optional chaining for parent node (#5571)
* fix: add optional chaining for parent node

* chore: revert yarn lock changes
2024-09-10 14:41:48 +05:30
M. Palanikannan
069f8b950e fix: svg not supported in image uploads in the editor (#5558)
* fix: svg not supported in image uploads

* fix: svg image file error message fixed
2024-09-10 14:27:27 +05:30
Akshita Goyal
5eb868e07d [WEB 2418] Fix minor UI inconsistencies (#5568)
* fix: project features modal padding

* fix: minor ui inconsistencies

* fix: lint issue
2024-09-10 14:24:07 +05:30
Aaryan Khandelwal
7c77fc1680 fix: task list not getting synced (#5566) 2024-09-09 21:35:31 +05:30
Anmol Singh Bhatia
99a7867a5e [WEB-2228] fix: dashboard peek overview issue stats #5442 (#5560)
* fix: dashboard issue stats

* chore: code refactor
2024-09-09 20:37:46 +05:30
Ketan Sharma
c44bf861e0 [WEB-2415] fix:remove input type to fix image upload (#5563)
* remove input type to fix things

* made the same changes in all locations
2024-09-09 20:12:15 +05:30
M. Palanikannan
4d38a10f8b fix: character count to work properly on editor rerenders and read only mode (#5554)
* fix: character count to work properly on editor rerenders and read only mode

* fix: destructuring properly at the start
2024-09-09 19:59:07 +05:30
Akshita Goyal
7c3fc690e9 fix: project features modal padding (#5562) 2024-09-09 19:22:47 +05:30
Prateek Shourya
8cf1c2d136 [WEB-2413] chore: admin application restructuring. (#5557) 2024-09-09 17:43:56 +05:30
Ketan Sharma
fe280b2beb [WEB-2106] fix: add date and state change functionalities to list and grid view (#5533)
* added functionality to list and grid

* fixed logic for archived module

* fixed logic for list view

* improved logic and fixed linting issues

* improved variable names
2024-09-09 16:50:56 +05:30
Ketan Sharma
ad5c6ee4f5 [WEB-2201] fix: clear email button on login screen (#5546)
* fixed the logic

* made required css changes

* replicated same for space component

* fixed variable name

* replicated for space

* better variable name

* improved the css

* replicated for space
2024-09-09 14:58:06 +05:30
Mihir
ba0d1ba518 Update sidebar (#5549)
Removed the else statement which was expanding the sidebar whenever windowSize changed or the web app was hard refreshed.
2024-09-09 14:57:05 +05:30
M. Palanikannan
70ea1459cd fix: async loading of the redis extension (#5537)
* fix: async loading of the redis extension

* fix: initialize redis connection and hocuspocusserver only during server start

* fix: removed console logs

* fix: remove async

* fix: error handling and shutting down gracefully in unhandled errors

* feat: added compression library

* fix: added helmet for security headers
2024-09-07 14:24:20 +05:30
Aaryan Khandelwal
8154a190d2 [WEB-1116] fix: editor info badges occupying multiple lines (#5548) 2024-09-07 09:01:01 +05:30
Ketan Sharma
29fd1186ee [WEB-2129] fix: module creation and updation toast error (#5550)
* chore: added error message for module name

* used the backend message

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-09-07 08:58:28 +05:30
Aaryan Khandelwal
68b412badf [WEB-1933] refactor: link create/update for issues and modules (#5543)
* chore: added module and issue link validation

* refactor: issues and modules link modal

* chore: changed the url validation logic

* chore: code cleanup

* refactor: modules link logic

* chore: removed the validator function

* fix: url validation regex

* chore: removed unwanted imports

* chore: reverted the external api changes

* refactor: link modals

* refactor: reset modal logic

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-09-06 22:52:29 +05:30
Akshita Goyal
c95aa6a0f7 [WEB-2273] Fix: page alignments (#5541)
* chore: headers + common containers

* fix: filters code splitting

* fix: home header

* fix: header changes

* chore: page alignments fixed

* fix: uncommented filters

* fix: used enums

* fix: cards + filters

* fix: enum changes

* fix: reverted package changes

* fix: reverted package changes

* fix: Card + tags separated + naming fixed

* fix: card + tags separated + naming fixed

* fix: mobile headers fixed partially

* fix: build errors + minor css

* fix: checkbox spacing

* fix: review changes

* fix: lint errors

* fix: minor review changes

* fix: header-alignments

* fix: tabs

* fix: settings page

* fix: subgroup page

* fix: mobile headers

* fix: settings mobile header made observable

* fix: lint error + edge case handling
2024-09-06 18:38:53 +05:30
rahulramesha
751cd6c862 [WEB-2365] fix: Minor UI inconsistencies caused by tooltip changes (#5545)
* Fix minor inconsistencies caused by tooltip on hover changes

* fix linting
2024-09-06 18:37:57 +05:30
Prateek Shourya
1032bc75d7 [WEB-2332] chore: layout structure improvement. (#5538)
* [WEB-2332] chore: layout structure improvement.

* chore: improve layout.
2024-09-06 16:46:42 +05:30
Ketan Sharma
9415a5ba00 made required changes in css (#5542) 2024-09-06 16:22:59 +05:30
Akshat Jain
d24a4e18a2 add: API_BASE_URL env to selfhost envs (#5523)
* add: API_BASE_URL env to selfhost envs

* Update variables.env
2024-09-06 16:22:16 +05:30
Anmol Singh Bhatia
52f78a86af [PWA-26] chore: pwa input focus improvement (#5507)
* chore: pwa dropdown input focus improvement

* chore: tab indices helper function updated and code refactor

* chore: modal tab index refactoring

* fix: PWA filters input autofocus

* chore: intake tab index updated and code refactor

* chore: code refactor
2024-09-06 16:21:14 +05:30
Anmol Singh Bhatia
c84c37805c [PWA-22] chore: pwa issue redirection (#5544)
* chore: issue peek overview redirection hook added

* chore: handleIssuePeekOverview function updated
2024-09-06 15:36:06 +05:30
Anmol Singh Bhatia
c2758caf95 chore: pwa issue detail improvement (#5540) 2024-09-06 15:23:48 +05:30
M. Palanikannan
73654a25c4 fix: redis connection instantiated out (#5534) 2024-09-05 20:18:26 +05:30
M. Palanikannan
e1380f52ec fix: add the redis extension conditionally (#5524)
* fix: add the redis extension conditionally

* chore: import order and stuff

* fix: added logger, error handling and routing

* feat: configured sentry with source maps

* fix: sentry config and returning json

* fix: remove on change logs

* fix: add pretty print
2024-09-05 18:15:46 +05:30
Anmol Singh Bhatia
406ffcd7de [WEB-2358] fix: recent collaborators (#5532)
* fix: recent collaborators

* fix: recent collaborators loader
2024-09-05 18:09:10 +05:30
Bavisetti Narayan
d265635f7e chore: workspace active page filter (#5531) 2024-09-05 15:38:45 +05:30
Bavisetti Narayan
3d7098855f [WEB-2358] chore: optimised the recent collaborators endpoint (#5470)
* chore: optimised the recent collaborators endpoint

* chore: recent collaborators code refactor

* chore: sorted the users based on active issues

* chore: recent collaborators sorting

* chore: code refactor

---------

Co-authored-by: Anmol Singh Bhatia <anmolsinghbhatia@plane.so>
2024-09-05 15:38:10 +05:30
rahulramesha
bf49ebb519 Add missing Mobx observers to components (#5530) 2024-09-05 15:34:08 +05:30
Bavisetti Narayan
4c8e8d985c fix: now parent can be expanded in external api (#5511) 2024-09-05 13:32:03 +05:30
Bavisetti Narayan
a3a7053be7 chore: added identifiers in the notification (#5513) 2024-09-05 13:30:44 +05:30
Aaryan Khandelwal
dbecf5cf5e chore: add favorites option inside a page (#5512) 2024-09-05 13:18:11 +05:30
Aaryan Khandelwal
bd20d71fc4 chore: add extra check to the version editor (#5521) 2024-09-05 12:38:50 +05:30
Aaryan Khandelwal
b80049d533 fix: untitled page title in favorites list (#5515) 2024-09-05 12:37:15 +05:30
Akshita Goyal
87dbb9b888 [WEB-2273] Chore: page alignments (#5505)
* chore: headers + common containers

* fix: filters code splitting

* fix: home header

* fix: header changes

* chore: page alignments fixed

* fix: uncommented filters

* fix: used enums

* fix: cards + filters

* fix: enum changes

* fix: reverted package changes

* fix: reverted package changes

* fix: Card + tags separated + naming fixed

* fix: card + tags separated + naming fixed

* fix: mobile headers fixed partially

* fix: build errors + minor css

* fix: checkbox spacing

* fix: review changes

* fix: lint errors

* fix: minor review changes
2024-09-05 12:16:24 +05:30
Prateek Shourya
c78b2344b8 [WEB-2376] dev: workspace settings improvement & refactor. (#5519)
* [WEB-2376] dev: workspace settings improvement & refactor.

* chore: update `filterWorkspaceSettingLinks` to `shouldRenderSettingLink`.
2024-09-04 20:21:16 +05:30
Anmol Singh Bhatia
eea6ceaec4 fix: pwa intake issue comment section z-index (#5522) 2024-09-04 20:15:46 +05:30
Mihir
7750844fc3 [WEB-2216] fix: added validation check for white space for create issue modal (#5468)
* Updated validation check for issue modal

* Updates to functions for throwing errors

* Updates to functions for throwing errors
2024-09-04 20:15:14 +05:30
Aaryan Khandelwal
f0da532db7 fix: remove esm build for the ui package (#5517) 2024-09-04 18:12:31 +05:30
dependabot[bot]
5180daae87 chore(deps): bump cryptography in /apiserver/requirements (#5520)
Bumps [cryptography](https://github.com/pyca/cryptography) from 42.0.5 to 43.0.1.
- [Changelog](https://github.com/pyca/cryptography/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pyca/cryptography/compare/42.0.5...43.0.1)

---
updated-dependencies:
- dependency-name: cryptography
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-09-04 15:17:25 +04:00
M. Palanikannan
9f12d13dea fix: initialize redis client and pass it to hocuspocus (#5516)
* fix: initialize redis client and pass it to hocuspocus

* chore: renamed func

* fix: yarn lock
2024-09-04 16:35:01 +05:30
Prateek Shourya
20b1558dd7 [WEB-2332] fix: application layout and minor UI improvements. (#5514)
* [WEB-2332] fix: application layout and minor UI improvements.

* [WEB-2332] fix: revert back layout changes.

* fix: lint error.

* fix: lint errors.
2024-09-04 16:09:55 +05:30
Akshita Goyal
22656d0114 [WEB-2273] Chore: header UI (#5467)
* chore: headers + common containers

* fix: filters code splitting

* fix: home header

* fix: header changes

* fix: uncommented filters

* fix: used enums

* fix: enum changes
2024-09-04 14:38:30 +05:30
Aaryan Khandelwal
747905a96d refactor: utility handlers (#5510) 2024-09-03 18:36:31 +05:30
Ketan Sharma
b6d596b474 replaced necessary .svg files with .webp and made edits to the imports in the file (#5474) 2024-09-03 18:31:01 +05:30
Dima Hinev
a36d4480bd chore: search on enter for image picker popover unsplash input (#5499) 2024-09-03 18:29:48 +05:30
rahulramesha
3fbfe94f5f add issue_type to filters from when loading from persisted data (#5509) 2024-09-03 17:59:43 +05:30
M. Palanikannan
1cd7259852 fix: parse redis url to get hostname and port (#5502)
* fix: parse redis url to get hostname and port

* fix: redis url accepted for connection

* chore: add redis url to example env

* fix: let users add redis port and host in case redis url is not present

* chore: create url from host and port variables

* fix: return empty string in case of no config
2024-09-03 17:29:03 +05:30
Aaryan Khandelwal
5840b40d96 [WEB-1116] chore: live server code splitting (#5508)
* chore: live server code splitting

* chore: update import paths

* chore: update babel path alias

* fix: document types type

* chore: updated error messages
2024-09-03 17:03:50 +05:30
Ketan Sharma
1ef535af7b [WEB-2254] fix: change message for issue via link empty state (#5492)
* change empty state message for issues opened via link

* remove log statement
2024-09-03 15:56:38 +05:30
rahulramesha
fd3e3d1a19 fix dev build for plane ui (#5506) 2024-09-03 15:44:00 +05:30
Aaryan Khandelwal
9910ed6e5f [WEB-1116] refactor: page helpers for document transformation (#5503)
* refactor: page helpers for document transformation

* refactor: update transformation function name
2024-09-03 15:31:32 +05:30
Aaryan Khandelwal
539acd58f7 chore: update live server env example file (#5496) 2024-09-03 13:00:08 +05:30
Prateek Shourya
a11c12cd7b [ENG-37] chore: sidebar help section revamp. (#5495)
* [ENG-37] chore: sidebar help section revamp.

* fix: lint error.
2024-09-02 21:29:09 +05:30
Anmol Singh Bhatia
e9f486eec6 fix: completed cycle issue transfer validation (#5494) 2024-09-02 18:01:37 +05:30
Aaryan Khandelwal
6c3a8a9647 [WEB-1116] feat: pages realtime collaboration (#5493)
* [WEB-1116] feat: pages realtime sync (#5057)

* init: live server for editor realtime sync

* chore: authentication added

* chore: updated logic to convert html to binary for old pages

* chore: added description json on page update

* chore: made all functions generic

* chore: save description in json and html formats

* refactor: document editor components

* chore: uncomment ui package components

* fix: without props extensions refactor

* fix: merge conflicts resolved from preview

* chore: init docker compose

* chore: pages custom error codes

* chore: add health check endpoint to the live server

* chore: update without props extensions type

* chore: better error handling

* chore: update react-hook-form versions

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>

* fix: docker related fixes

* fix: module type fixes

* fix: nginx update

* fix: adding live server workflow

* fix: workflow fixes

* fix: docker compose fixes

* fix: workflow fixes

* fix: path config

* fix: docker compose warnings

* fix: nginx port forwarding

* fix: update docker compose with new env

* fix: env var fixes

* fix: error handling

* fix: docker compose env var

* fix: compose fixes

* chore: update server start message

* chore: handle errors

* fix: build errors

* chore: update port

* chore: update server port

* chore: show error on authentication fail

* chore: show error on authentication fail

* feat: add redis extension

* chore: updated restore version logic

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
Co-authored-by: Palanikannan M <akashmalinimurugu@gmail.com>
2024-09-02 17:54:12 +05:30
Akshat Jain
2c950713a7 Add RabbitMQ Service to Docker Compose Configuration (#5439)
* fix: celery broker setup

* fix: docker compose update

* fixed rabbitmq vhost issue

* fix: env fixes

* fix-envs-issue in selfhost docker compose

* volume name fix

* added depends on for rabbitmq service

* Add: AMQP_URL for remote rabbitmq urls

* added amqp url in docker compose

* changed default user to guest

* fix: changed the RabbitMQ password var name

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2024-09-02 17:40:17 +05:30
Prateek Shourya
8526b801f4 fix: cycle analytics response. (#5480) 2024-09-02 15:12:32 +05:30
rahulramesha
10c253471c [WEB-2365] fix: Misaligned tooltips in few components (#5486)
* fix mis-aligned tooltips in few components

* fix tooltip for kanban title
2024-09-02 15:09:55 +05:30
Dima Hinev
65b9cfbfe2 fix: improve indentation for workspace menu (#5487) (#5489) 2024-09-02 15:08:40 +05:30
Anmol Singh Bhatia
12a304b04f [WEB-2228] fix: dashboard peek overview issue fetch (#5442)
* fix: dashboard peekoverview issue fetch

* fix: intake issue modal remove parent issue action
2024-09-02 14:01:57 +05:30
Aaryan Khandelwal
bac5b53ffb [WEB-2348] fix: allow updating comments with just mentions in them (#5471)
* fix: accept mentions while updating comments

* chore: remove console log

* chore: update empty string helper function
2024-09-02 14:00:41 +05:30
Aaryan Khandelwal
03c28a11e8 fix: highlight current user on read only lite text editor (#5472) 2024-09-02 13:58:55 +05:30
rahulramesha
bcd08b3159 [WEB-2363] fix: Error while updating issue in cycles (#5478)
* fix update parent stats error

* fix web lint
2024-09-02 13:58:36 +05:30
Bavisetti Narayan
599092d76b chore: added issue webhook (#5463) 2024-08-30 20:26:43 +05:30
Bavisetti Narayan
1d2e7d3fd8 [WEB-2359] chore: resolved the bugs reported in sentry (#5447)
* chore: resolved the bugs reported in sentry

* chore: html content none type validation

* chore: changed the webhook key name
2024-08-30 20:26:09 +05:30
Ketan Sharma
9d9a812f7b changed the old message to the new one (#5475) 2024-08-30 19:58:39 +05:30
Anmol Singh Bhatia
b9f78ba42b chore: next image config updated (#5452) 2024-08-30 19:24:29 +05:30
Ketan Sharma
2e890e4d6f [WEB-2294] fix: remove 'Add Project' button from archives route and remove it from the dropdown in header (#5469)
* fix: remove 'Add Project' button from archives route and remove it from the dropdown in header

* Improved Code Logic

* Fixed Clear All Button and UI Fixes
2024-08-30 19:08:35 +05:30
rahulramesha
c1d3da0cab use-platform-os hook optimization to not cause re renders (#5453) 2024-08-30 19:05:22 +05:30
rahulramesha
4598b1b49d [WEB-2341] feat: Add display filters and display properties to create/update view dialog (#5451)
* Add display filters and display properties to create view dialog

* revert back display filter selection change
2024-08-30 19:04:38 +05:30
rahulramesha
693085577d [WEB-2316] chore: Render Tooltips and Dropdowns in certain places on hover to improve rendering performance (#5456)
* render tooltips and dropdowns in certain places post hover to improve performance

* fix useEffect hooks
2024-08-29 21:07:49 +05:30
Anmol Singh Bhatia
33ab6029dc fix: intake issue accept modal (#5465) 2024-08-29 19:26:26 +05:30
Ketan Sharma
dc2e7ca3d5 increase z-index from z-20 to z-[21] in dropdown.tsx (#5446) 2024-08-29 19:25:55 +05:30
Mihir
b14a919c35 [WEB-2145] chore: added copy button for intake issues (#5455)
* chore: added copy button for intake issues

* Updated button UX

Updated button UX and handleCopyIssue function

* Removed commented code
2024-08-29 18:22:02 +05:30
Nikhil
6d8ba9dfa3 chore: add migration on svg (#5464) 2024-08-29 15:13:17 +05:30
Nikhil
0fbe4c4de2 chore: limit svg uploads (#5462)
* fix: limit svg file uploads

* chore: limit svg uploads
2024-08-29 13:31:41 +05:30
Nikhil
22a214795d chore: user and profile serializers (#5459)
* fix: user serializer

* chore: remove __all__ from serializers
2024-08-29 13:31:13 +05:30
Aaryan Khandelwal
f843a5153b fix: version history editor overflow (#5461) 2024-08-29 12:49:59 +05:30
Anmol Singh Bhatia
3c78292618 [WEB-2344] fix: quick action hover (#5449)
* fix: quick action hover

* chore: code refactor
2024-08-28 20:02:14 +05:30
Aaryan Khandelwal
de273dd618 [WEB-2293] refactor: version editor (#5454)
* refactor: version editor

* chore: added missing props
2024-08-28 19:56:28 +05:30
Aaryan Khandelwal
0cce39ec7c [WEB-2338] chore: handle untitled page breadcrumbs (#5445)
* chore: handle untitled page titles

* chore: store page title in a const
2024-08-28 14:35:45 +05:30
Anmol Singh Bhatia
3ee14771e7 [PWA-1] fix: pwa app sidebar redirection (#5416)
* fix: pwa app sidebar redirection

* chore: pwa app sidebar improvement
2024-08-28 14:33:10 +05:30
Anmol Singh Bhatia
59697d34f8 [PWA-17] chore: project view list header improvement (#5425)
* chore: project view list header improvement

* chore: code refactor
2024-08-28 14:31:27 +05:30
Aaryan Khandelwal
7efda1c392 [WEB-2050] dev: added new information panels to a page (#5409)
* dev: added new information panels to pages

* refactor: update function name
2024-08-28 14:08:29 +05:30
Aaryan Khandelwal
fb2a04dc14 chore: add authorization to restore version (#5444) 2024-08-28 14:03:01 +05:30
Mohamed Ashraf
e6baa6fa2c chore: add IDX configuration so anyone can edit the project from idx.google.com (#5398)
* chore: add IDX configuration so anyone can edit the project from idx.google.com

* chore: add python, postgres and redis to the idx config
2024-08-28 13:52:25 +05:30
Prateek Shourya
9372677f0c [WEB-2343] fix: click events in spreadsheet layout quick action menu. (#5443) 2024-08-27 22:11:25 +05:30
Akshita Goyal
716300d964 [WEB-2114]: Chore: project cycle optimization (#5430)
* chore: project cycle optimization

* fix: typo

* chore: changed the label typo

* feat: integrated optimized api

* chore: added every key as plural

* fix: productivity dropdown

* fix: removed logging

* fix: handled loading

* fix: loaders

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-08-27 19:50:20 +05:30
Aaryan Khandelwal
b22bdef9e1 chore: move version history editor to edition specific structure (#5441) 2024-08-27 19:50:07 +05:30
guru_sainath
23dcdd6407 [WEB-2115] chore: implemented global paginator and handled project issues pagination v1 (#5432)
* chore: implemented global paginator and handled project issues pagination v1

* chore: updated order_by

* chore: updated updated_at parameter to updated_at__gte

* chore: changed updated_at__gte default value to None
2024-08-27 19:12:55 +05:30
guru_sainath
09209694a4 [WEB-2329] chore: updated UI for module and cycle detail overview (#5435)
* chore: updated UI for module and cycle detail overview

* chore: z-index issue in sheet
2024-08-27 17:45:17 +05:30
Prateek Shourya
88013e3b06 [WEB-2312] chore: minor UI and UX copy improvements. (#5438) 2024-08-27 17:27:59 +05:30
sriram veeraghanta
51fba04226 fix: intake issue bugfixes on external apis 2024-08-27 16:58:42 +05:30
Anmol Singh Bhatia
f39fc3e9ca [PWA-12] chore: project analytics modal header improvement (#5427)
* chore: project analytics modal header improvement

* chore: code refactor
2024-08-27 16:49:52 +05:30
Anmol Singh Bhatia
e3cd7050fa [PWA-11] fix: pwa kanban layout block (#5426)
* fix: pwa kanban layout block

* chore: code refactor
2024-08-27 16:47:49 +05:30
Anmol Singh Bhatia
a19226ac64 fix: intake issue create and update modal (#5434) 2024-08-27 16:47:05 +05:30
rahulramesha
e7a41b3c32 redirect to issues page post deletion (#5437) 2024-08-27 16:46:53 +05:30
Ketan Sharma
224c8bc0a1 add vertical padding to div containing SidebarUserMenu (#5436) 2024-08-27 16:08:50 +05:30
Prateek Shourya
83ceba3166 [WEB-2332 | 2295] style: UI improvements. (#5433)
* [WEB-2332] style: minor layout improvements.

* [WEB-2295] style: fix scrollbar padding in workspace list section of profile settings.

* style: add `app-container` css.
2024-08-27 14:26:09 +05:30
Ketan Sharma
08c9bd7949 change z-index from 5 to 1 (#5428) 2024-08-27 12:54:12 +05:30
Ketan Sharma
4689ebe2ba Fix: Error Toast Message for Issue Attachment (#5424) 2024-08-26 16:58:32 +05:30
rahulramesha
0dce67b149 fix to use the correct created_by while checking if the current user is the creator of the inbox issue (#5422) 2024-08-26 16:57:01 +05:30
Akshita Goyal
803992cc98 [WEB-1936] fix: flicker issue in issues list layout (#5412)
* fix: flicker issue in issues list layout

* fix: formatting

* fix: optimization

* fix: added optional chaining for safety
2024-08-26 16:56:21 +05:30
rahulramesha
890379b64f Make quick action dropdowns use capture phase of the event to trigger closure on outside click (#5414) 2024-08-26 14:40:11 +05:30
Aaryan Khandelwal
a0ed51c845 [WEB-2293] feat: pages version history (#5417)
* chore: project page version

* feat: page version history implemented

* chore: hide save button when version history overlay is active

* refactor: updated navigation logic

* chore: added error states

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-08-26 14:03:55 +05:30
Anmol Singh Bhatia
d802316c5c [WEB-2263] fix: god mode wrong credentials error message banner (#5407)
* fix: god mode wrong credentials error message banner

* chore: code refactor
2024-08-26 13:07:00 +05:30
Anmol Singh Bhatia
bd3f117545 [PWA-2] fix: pwa input zoom effect (#5402)
* fix: pwa input zoom effect

* fix: pwa input zoom effect

* fix: pwa input zoom effect

* fix: pwa sticky issue comment

* chore: code refactor

* chore: code refactor
2024-08-26 13:02:30 +05:30
Anmol Singh Bhatia
9065932c86 fix: pwa sticky issue comment (#5419) 2024-08-23 19:06:12 +05:30
Prateek Shourya
700f3ee823 chore: pricing update. (#5410) 2024-08-23 18:04:55 +05:30
rahulramesha
adf891bcba [WEB-2150] fix: issue selection redirect alert (#5406)
* fix issue selection redirect alert

* change message content for user prompt
2024-08-23 18:00:15 +05:30
Anmol Singh Bhatia
48e9042970 [WEB-2289] fix: email notification settings form validation (#5413)
* fix: email notification validation

* chore: code refactor
2024-08-22 17:33:14 +05:30
sriram veeraghanta
460003c7f5 fix: removing permissions from user notifications 2024-08-22 16:47:34 +05:30
Anmol Singh Bhatia
9f20936c86 fix: project intake viewer permission validation (#5408) 2024-08-22 16:11:53 +05:30
Prateek Shourya
ae9267e0b0 chore: remove next pwa (#5396) 2024-08-21 17:54:13 +05:30
sriram veeraghanta
b3bff4c72c fix: removing proxy url 2024-08-21 17:40:39 +05:30
Prateek Shourya
36c9f8bd83 chore: fix z-index issue in member picker. (#5404) 2024-08-21 16:52:53 +05:30
rahulramesha
696b1340c5 [WEB-2133] fix : Remove inbox delete option for members (#5395)
* remove inbox delete option for members

* change inbox issue delete condition slightly
2024-08-21 16:50:03 +05:30
Aaryan Khandelwal
881d0525cc refactor: ai menu (#5400) 2024-08-21 16:19:28 +05:30
Anmol Singh Bhatia
c100c0bd85 fix: empty state comic button responsiveness (#5401) 2024-08-21 16:17:35 +05:30
Akshita Goyal
5fc99c9ce5 [WEB-1986] fix: remove the user favourites when archived a particular entity (#5388)
* chore: pages custom error codes

* fix: project archive issue

* fix: delete issue + dropdown z-index fix

* fix: import issue

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-08-21 13:20:22 +05:30
Anmol Singh Bhatia
f789c72cac fix: workspace inbox read endpoint permission (#5391) 2024-08-20 19:49:48 +05:30
Bavisetti Narayan
650328c6f2 [WEB-1986] fix: remove the user favourites when archived a particular entity (#5387)
* chore: pages custom error codes

* fix: view role permission
2024-08-20 19:40:48 +05:30
Bavisetti Narayan
ffbc5942da chore: export issues permission changed (#5392) 2024-08-20 19:39:24 +05:30
Prateek Shourya
854a90c3f1 chore: minor UI improvement in issue modal. (#5390) 2024-08-20 15:50:29 +05:30
M. Palanikannan
d9b0fe2aaa fix: placeholder for list items (#5389) 2024-08-20 15:03:16 +05:30
Bavisetti Narayan
6748065456 [WEB-1980] feat: user recent visited entities (#5211)
* feat: recent visited

* chore: recent visited 20 records

* chore: removed the old table

* chore: view detail endpoint
2024-08-19 20:28:19 +05:30
Prateek Shourya
e6526a31c8 chore: create/ update issue modal restructure. (#5385)
* chore: create/ update issue modal restructure.

* chore: minor UI improvements.
2024-08-19 19:38:28 +05:30
Akshat Jain
bf08d21da6 Version update for postgres and python (#5378)
* version updates for python and postgres

* updated version for python and postgres

* Update docker-compose.yml
2024-08-19 16:27:36 +05:30
Prateek Shourya
807dfec7ad chore: components restructure and improvements (#5383)
* chore: update issue identifier component.

* fix: issue where the browser tab closed on closing the emoji picker.

* chore: revert back changes in logo props.

* chore: update sortable.

* chore: minor components restructuring.

* minor ui update.

* fix: issue identifier display in command palette search.

* style: issue activity icons consistency.
2024-08-19 13:40:19 +05:30
Henit Chobisa
c829b52c0f fix: issue serializer breaking (#5379) 2024-08-16 20:46:42 +05:30
Prateek Shourya
f675ea3f5d chore: rename active filters to applied filters (#5377) 2024-08-16 18:15:55 +05:30
sriram veeraghanta
02e18b4293 fix: turbo upgrade 2024-08-16 17:58:45 +05:30
sriram veeraghanta
3729011cb0 fix: merge conflicts from preview 2024-08-16 17:55:08 +05:30
sriram veeraghanta
9e565df11b fix: apiserver build errors 2024-08-16 17:53:41 +05:30
Prateek Shourya
4ca45a971c chore: issue filters restructuring. (#5372) 2024-08-16 16:48:00 +05:30
rahulramesha
89633d8b2a fix sort order in states for space app (#5374) 2024-08-16 16:47:07 +05:30
Anmol Singh Bhatia
0a1c656865 [WEB-2126] chore: guest and viewer role permission (#5347)
* chore: user store code refactor

* chore: general unauthorized screen asset added

* chore: workspace setting sidebar options updated for guest and viewer

* chore: NotAuthorizedView component code updated

* chore: project setting layout code refactor

* chore: workspace setting members and exports page permission validation added

* chore: workspace members and exports settings page improvement

* chore: project invite modal updated

* chore: workspace setting unauthorized access empty state

* chore: workspace setting unauthorized access empty state

* chore: project settings sidebar permission updated

* fix: project settings user role permission updated

* chore: app sidebar role permission validation updated

* chore: app sidebar role permission validation

* chore: disabled page empty state validation

* chore: app sidebar add project improvement

* chore: guest role changes

* fix: user favorite

* chore: changed pages permission

* chore: guest role changes

* fix: app sidebar project item permission

* fix: project setting empty state flicker

* fix: workspace setting empty state flicker

* chore: granted notification permission to viewer

* chore: project invite and edit validation updated

* chore: favorite validation added for guest and viewer role

* chore: create view validation updated

* chore: views permission changes

* chore: create view empty state validation updated

* chore: created ENUM for permissions

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
Co-authored-by: Bavisetti Narayan <72156168+NarayanBavisetti@users.noreply.github.com>
2024-08-16 16:35:05 +05:30
Anmol Singh Bhatia
d60e988ca1 fix: issue delete notification message updated (#5373) 2024-08-16 16:30:54 +05:30
Aaryan Khandelwal
a36adae995 [WEB-2047] dev: pages side menu refactor (#5371)
* dev: pages ai menu

* chore: remove unused tasks
2024-08-16 16:17:33 +05:30
sriram veeraghanta
1757b360f3 fix: type fixes 2024-08-16 14:24:58 +05:30
Akshat Jain
8e87c48249 fix: adding secret key variable in newline (#5361)
* fix: adding secret key variable in newline

adding the secret key variable on a new line in the api server env file and setting a default value for `HARD_DELETE_AFTER_DAYS`

* added newline at EOF
2024-08-16 11:57:52 +05:30
Anmol Singh Bhatia
3e83eed398 [WEB-2233] fix: intake issue comment (#5368)
* fix: intake issue comment

* chore: issue comment improvement
2024-08-14 19:38:37 +05:30
Henit Chobisa
4a71eef72e feat: added put request for issues api for upserting issues (#5367) 2024-08-14 18:25:49 +05:30
vamsi
a5a4496800 fix: adding throttling at base api view for external apis 2024-08-14 17:41:40 +05:30
vamsi
172f39e231 fix: adding service token throttle class 2024-08-14 17:38:05 +05:30
pablohashescobar
56ea45f44c chore: migrations for constraints 2024-08-14 14:26:44 +05:30
pablohashescobar
729bad4344 fix: migration 2024-08-14 13:57:59 +05:30
dependabot[bot]
5f26ce2466 chore(deps): bump axios from 1.7.2 to 1.7.4 (#5364)
Bumps [axios](https://github.com/axios/axios) from 1.7.2 to 1.7.4.
- [Release notes](https://github.com/axios/axios/releases)
- [Changelog](https://github.com/axios/axios/blob/v1.x/CHANGELOG.md)
- [Commits](https://github.com/axios/axios/compare/v1.7.2...v1.7.4)

---
updated-dependencies:
- dependency-name: axios
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-08-14 13:41:16 +05:30
guru_sainath
c02a54ef31 [WEB-2214] chore: migration for user favorite, file asset, and deploy board (#5339)
* chore: migrations for user favorite, file asset, and deploy boards

* fix: migration fixes

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2024-08-14 13:07:08 +05:30
Anmol Singh Bhatia
d9c9d85d38 [WEB-2221] fix: app sidebar and favorites improvement (#5357)
* fix: project collapsible toggle

* fix: project favorite redirection

* chore: favorite redirection scroll into view implementation

* fix: use favorite item details project details
2024-08-14 12:53:53 +05:30
Prateek Shourya
49a895f117 improvement: merge quick add logic for all layouts. (#5323) 2024-08-07 20:54:08 +05:30
Prateek Shourya
333a989b1a chore: components restructuring and UI improvements. (#5285)
* chore: components restructuring and minor UI improvements.

* chore: minor UI improvements for icons and member dropdown.

* chore: update issue identifier.

* chore: rename `Issue Extra Property` to `Issue Additional Property`

* chore: fix popovers placement issue on components with overflow.

* chore: add `scrollbar-xs`

* chore: add `xs` size for input and textarea components.

* chore: update `sortable` to return back `movedItem` in the onChange callback.

* chore: minor UI adjustments for radio-select.

* chore: update outside click delay to 1ms.
2024-08-05 20:42:14 +05:30
sriram veeraghanta
707570ca7a Merge pull request #5041 from makeplane/preview
release: v0.22-dev
2024-07-05 13:28:45 +05:30
sriram veeraghanta
c76af7d7d6 Merge pull request #4688 from makeplane/preview
release: v0.21-dev
2024-06-03 18:54:06 +05:30
sriram veeraghanta
1dcea9bcc8 Merge pull request #4569 from makeplane/preview
release: v0.20-dev
2024-05-23 19:55:06 +05:30
sriram veeraghanta
da957e06b6 Merge pull request #4349 from makeplane/preview
release: v0.19-dev
2024-05-03 20:36:07 +05:30
sriram veeraghanta
a0b9596cb4 Merge pull request #4239 from makeplane/preview
chore:version update
2024-04-19 12:01:15 +05:30
sriram veeraghanta
f71e8a3a0f Merge pull request #4238 from makeplane/preview
release: v0.18-dev
2024-04-19 11:56:03 +05:30
sriram veeraghanta
002fb4547b Merge pull request #4107 from makeplane/preview
release: v0.17-dev
2024-04-02 20:07:48 +05:30
sriram veeraghanta
c1b1ba35c1 Merge pull request #3878 from makeplane/preview
release: v0.16-dev
2024-03-05 20:04:08 +05:30
sriram veeraghanta
4566d6e80c Merge pull request #3697 from makeplane/preview
release: 0.15.4-dev
2024-02-19 19:30:06 +05:30
sriram veeraghanta
e8d359e625 Merge pull request #3674 from makeplane/preview
fix: build branch docker images push on release
2024-02-15 14:35:32 +05:30
sriram veeraghanta
351eba8d61 Merge pull request #3671 from makeplane/preview
release: peek overview issue description initial load bug (#3670)
2024-02-15 03:25:30 +05:30
sriram veeraghanta
1e27e37b51 Merge pull request #3666 from makeplane/preview
release: v0.15.2-dev
2024-02-14 19:41:55 +05:30
sriram veeraghanta
7df2e9cf11 Merge pull request #3632 from makeplane/preview
release: v0.15.1-dev
2024-02-12 20:59:56 +05:30
sriram veeraghanta
c6e3f1b932 Merge pull request #3535 from makeplane/preview
release: 0.15-dev
2024-02-01 15:01:49 +05:30
1331 changed files with 44840 additions and 26712 deletions

View File

@@ -8,6 +8,13 @@ PGDATA="/var/lib/postgresql/data"
REDIS_HOST="plane-redis"
REDIS_PORT="6379"
# RabbitMQ Settings
RABBITMQ_HOST="plane-mq"
RABBITMQ_PORT="5672"
RABBITMQ_USER="plane"
RABBITMQ_PASSWORD="plane"
RABBITMQ_VHOST="plane"
# AWS Settings
AWS_REGION=""
AWS_ACCESS_KEY_ID="access-key"
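
The hunk above adds the RabbitMQ settings introduced alongside the Celery broker changes in the self-host compose setup. As a minimal sketch, assuming the `AMQP_URL` convention referenced in the commit log (the exact variable names the API server reads are not shown in this hunk), these values typically compose into a single broker URL:

# Hypothetical sketch: deriving a broker URL from the variables above.
# A remote AMQP_URL, when provided, would take precedence over host/port pieces.
if [ -z "${AMQP_URL}" ]; then
  AMQP_URL="amqp://${RABBITMQ_USER}:${RABBITMQ_PASSWORD}@${RABBITMQ_HOST}:${RABBITMQ_PORT}/${RABBITMQ_VHOST}"
fi
echo "Using broker: ${AMQP_URL}"
# With the defaults above this resolves to amqp://plane:plane@plane-mq:5672/plane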

View File

@@ -1,59 +0,0 @@
/**
* Adds three new lint plugins over the existing configuration:
* This is used to lint staged files only.
* We should remove this file once the entire codebase follows these rules.
*/
module.exports = {
root: true,
extends: [
"custom",
],
parser: "@typescript-eslint/parser",
settings: {
"import/resolver": {
typescript: {},
node: {
moduleDirectory: ["node_modules", "."],
},
},
},
rules: {
"import/order": [
"error",
{
groups: ["builtin", "external", "internal", "parent", "sibling"],
pathGroups: [
{
pattern: "react",
group: "external",
position: "before",
},
{
pattern: "lucide-react",
group: "external",
position: "after",
},
{
pattern: "@headlessui/**",
group: "external",
position: "after",
},
{
pattern: "@plane/**",
group: "external",
position: "after",
},
{
pattern: "@/**",
group: "internal",
},
],
pathGroupsExcludedImportTypes: ["builtin", "internal", "react"],
alphabetize: {
order: "asc",
caseInsensitive: true,
},
},
],
},
};

View File

@@ -1,10 +0,0 @@
module.exports = {
root: true,
// This tells ESLint to load the config from the package `eslint-config-custom`
extends: ["custom"],
settings: {
next: {
rootDir: ["web/", "space/", "admin/"],
},
},
};

.gitattributes vendored Normal file (1 change)
View File

@@ -0,0 +1 @@
*.sh text eol=lf
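
The single-line `.gitattributes` entry above normalises shell scripts to LF line endings, likely in support of the earlier "fix: Local Setup on Windows" commit. A quick way to verify the attribute, shown here as a hypothetical check against an example script path:

# Hypothetical check; setup.sh stands in for any tracked shell script.
git check-attr text eol -- setup.sh
# expected output:
# setup.sh: text: set
# setup.sh: eol: lf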

.github/actions/build-push-ce/action.yml vendored Normal file (126 changes)
View File

@@ -0,0 +1,126 @@
name: "Build and Push Docker Image"
description: "Reusable action for building and pushing Docker images"
inputs:
docker-username:
description: "The Dockerhub username"
required: true
docker-token:
description: "The Dockerhub Token"
required: true
# Docker Image Options
docker-image-owner:
description: "The owner of the Docker image"
required: true
docker-image-name:
description: "The name of the Docker image"
required: true
build-context:
description: "The build context"
required: true
default: "."
dockerfile-path:
description: "The path to the Dockerfile"
required: true
build-args:
description: "The build arguments"
required: false
default: ""
# Buildx Options
buildx-driver:
description: "Buildx driver"
required: true
default: "docker-container"
buildx-version:
description: "Buildx version"
required: true
default: "latest"
buildx-platforms:
description: "Buildx platforms"
required: true
default: "linux/amd64"
buildx-endpoint:
description: "Buildx endpoint"
required: true
default: "default"
# Release Build Options
build-release:
description: "Flag to publish release"
required: false
default: "false"
build-prerelease:
description: "Flag to publish prerelease"
required: false
default: "false"
release-version:
description: "The release version"
required: false
default: "latest"
runs:
using: "composite"
steps:
- name: Set Docker Tag
shell: bash
env:
IMG_OWNER: ${{ inputs.docker-image-owner }}
IMG_NAME: ${{ inputs.docker-image-name }}
BUILD_RELEASE: ${{ inputs.build-release }}
IS_PRERELEASE: ${{ inputs.build-prerelease }}
REL_VERSION: ${{ inputs.release-version }}
run: |
FLAT_BRANCH_VERSION=$(echo "${{ github.ref_name }}" | sed 's/[^a-zA-Z0-9.-]//g')
if [ "${{ env.BUILD_RELEASE }}" == "true" ]; then
semver_regex="^v([0-9]+)\.([0-9]+)\.([0-9]+)(-[a-zA-Z0-9]+(-[a-zA-Z0-9]+)*)?$"
if [[ ! ${{ env.REL_VERSION }} =~ $semver_regex ]]; then
echo "Invalid Release Version Format : ${{ env.REL_VERSION }}"
echo "Please provide a valid SemVer version"
echo "e.g. v1.2.3 or v1.2.3-alpha-1"
echo "Exiting the build process"
exit 1 # Exit with status 1 to fail the step
fi
TAG=${{ env.IMG_OWNER }}/${{ env.IMG_NAME }}:${{ env.REL_VERSION }}
if [ "${{ env.IS_PRERELEASE }}" != "true" ]; then
TAG=${TAG},${{ env.IMG_OWNER }}/${{ env.IMG_NAME }}:stable
fi
elif [ "${{ env.TARGET_BRANCH }}" == "master" ]; then
TAG=${{ env.IMG_OWNER }}/${{ env.IMG_NAME }}:latest
else
TAG=${{ env.IMG_OWNER }}/${{ env.IMG_NAME }}:${FLAT_BRANCH_VERSION}
fi
echo "DOCKER_TAGS=${TAG}" >> $GITHUB_ENV
- name: Login to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ inputs.docker-username }}
password: ${{ inputs.docker-token}}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
with:
driver: ${{ inputs.buildx-driver }}
version: ${{ inputs.buildx-version }}
endpoint: ${{ inputs.buildx-endpoint }}
- name: Check out the repo
uses: actions/checkout@v4
- name: Build and Push Docker Image
uses: docker/build-push-action@v5.1.0
with:
context: ${{ inputs.build-context }}
file: ${{ inputs.dockerfile-path }}
platforms: ${{ inputs.buildx-platforms }}
tags: ${{ env.DOCKER_TAGS }}
push: true
build-args: ${{ inputs.build-args }}
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ inputs.docker-username }}
DOCKER_PASSWORD: ${{ inputs.docker-token }}
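
The composite action above centralises the tag-derivation logic that the per-image jobs below reuse. A small standalone sketch of that logic (hypothetical input values; the real step reads them from the action inputs and environment) shows how a prerelease skips the `stable` tag while a full release gets both:

# Hypothetical inputs mirroring the "Set Docker Tag" step above.
IMG_OWNER="makeplane"
IMG_NAME="plane-frontend"
REL_VERSION="v1.2.3-alpha-1"
IS_PRERELEASE="true"

semver_regex="^v([0-9]+)\.([0-9]+)\.([0-9]+)(-[a-zA-Z0-9]+(-[a-zA-Z0-9]+)*)?$"
if [[ ! ${REL_VERSION} =~ $semver_regex ]]; then
  echo "Invalid Release Version Format : ${REL_VERSION}" && exit 1
fi

TAG="${IMG_OWNER}/${IMG_NAME}:${REL_VERSION}"
# Only non-prereleases are additionally tagged as stable.
if [ "${IS_PRERELEASE}" != "true" ]; then
  TAG="${TAG},${IMG_OWNER}/${IMG_NAME}:stable"
fi
echo "DOCKER_TAGS=${TAG}"
# v1.2.3-alpha-1 as a prerelease -> makeplane/plane-frontend:v1.2.3-alpha-1
# v1.2.3 as a full release       -> makeplane/plane-frontend:v1.2.3,makeplane/plane-frontend:stable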

View File

@@ -1,21 +1,45 @@
name: Branch Build
name: Branch Build CE
on:
workflow_dispatch:
push:
branches:
- master
- preview
release:
types: [released, prereleased]
inputs:
build_type:
description: "Type of build to run"
required: true
type: choice
default: "Build"
options:
- "Build"
- "Release"
releaseVersion:
description: "Release Version"
type: string
default: v0.0.0
isPrerelease:
description: "Is Pre-release"
type: boolean
default: false
required: true
arm64:
description: "Build for ARM64 architecture"
required: false
default: false
type: boolean
# push:
# branches:
# - master
env:
TARGET_BRANCH: ${{ github.ref_name || github.event.release.target_commitish }}
TARGET_BRANCH: ${{ github.ref_name }}
ARM64_BUILD: ${{ github.event.inputs.arm64 }}
BUILD_TYPE: ${{ github.event.inputs.build_type }}
RELEASE_VERSION: ${{ github.event.inputs.releaseVersion }}
IS_PRERELEASE: ${{ github.event.inputs.isPrerelease }}
jobs:
branch_build_setup:
name: Build Setup
runs-on: ubuntu-latest
runs-on: ubuntu-20.04
outputs:
gh_branch_name: ${{ steps.set_env_variables.outputs.TARGET_BRANCH }}
gh_buildx_driver: ${{ steps.set_env_variables.outputs.BUILDX_DRIVER }}
@@ -27,12 +51,25 @@ jobs:
build_admin: ${{ steps.changed_files.outputs.admin_any_changed }}
build_space: ${{ steps.changed_files.outputs.space_any_changed }}
build_web: ${{ steps.changed_files.outputs.web_any_changed }}
build_live: ${{ steps.changed_files.outputs.live_any_changed }}
dh_img_web: ${{ steps.set_env_variables.outputs.DH_IMG_WEB }}
dh_img_space: ${{ steps.set_env_variables.outputs.DH_IMG_SPACE }}
dh_img_admin: ${{ steps.set_env_variables.outputs.DH_IMG_ADMIN }}
dh_img_live: ${{ steps.set_env_variables.outputs.DH_IMG_LIVE }}
dh_img_backend: ${{ steps.set_env_variables.outputs.DH_IMG_BACKEND }}
dh_img_proxy: ${{ steps.set_env_variables.outputs.DH_IMG_PROXY }}
build_type: ${{steps.set_env_variables.outputs.BUILD_TYPE}}
build_release: ${{ steps.set_env_variables.outputs.BUILD_RELEASE }}
build_prerelease: ${{ steps.set_env_variables.outputs.BUILD_PRERELEASE }}
release_version: ${{ steps.set_env_variables.outputs.RELEASE_VERSION }}
steps:
- id: set_env_variables
name: Set Environment Variables
run: |
if [ "${{ env.TARGET_BRANCH }}" == "master" ] || [ "${{ github.event_name }}" == "release" ]; then
if [ "${{ env.ARM64_BUILD }}" == "true" ] || ([ "${{ env.BUILD_TYPE }}" == "Release" ] && [ "${{ env.IS_PRERELEASE }}" != "true" ]); then
echo "BUILDX_DRIVER=cloud" >> $GITHUB_OUTPUT
echo "BUILDX_VERSION=lab:latest" >> $GITHUB_OUTPUT
echo "BUILDX_PLATFORMS=linux/amd64,linux/arm64" >> $GITHUB_OUTPUT
@@ -43,7 +80,43 @@ jobs:
echo "BUILDX_PLATFORMS=linux/amd64" >> $GITHUB_OUTPUT
echo "BUILDX_ENDPOINT=" >> $GITHUB_OUTPUT
fi
echo "TARGET_BRANCH=${{ env.TARGET_BRANCH }}" >> $GITHUB_OUTPUT
BR_NAME=$( echo "${{ env.TARGET_BRANCH }}" |sed 's/[^a-zA-Z0-9.-]//g')
echo "TARGET_BRANCH=$BR_NAME" >> $GITHUB_OUTPUT
echo "DH_IMG_WEB=plane-frontend" >> $GITHUB_OUTPUT
echo "DH_IMG_SPACE=plane-space" >> $GITHUB_OUTPUT
echo "DH_IMG_ADMIN=plane-admin" >> $GITHUB_OUTPUT
echo "DH_IMG_LIVE=plane-live" >> $GITHUB_OUTPUT
echo "DH_IMG_BACKEND=plane-backend" >> $GITHUB_OUTPUT
echo "DH_IMG_PROXY=plane-proxy" >> $GITHUB_OUTPUT
echo "BUILD_TYPE=${{env.BUILD_TYPE}}" >> $GITHUB_OUTPUT
BUILD_RELEASE=false
BUILD_PRERELEASE=false
RELVERSION="latest"
if [ "${{ env.BUILD_TYPE }}" == "Release" ]; then
FLAT_RELEASE_VERSION=$(echo "${{ env.RELEASE_VERSION }}" | sed 's/[^a-zA-Z0-9.-]//g')
echo "FLAT_RELEASE_VERSION=${FLAT_RELEASE_VERSION}" >> $GITHUB_OUTPUT
semver_regex="^v([0-9]+)\.([0-9]+)\.([0-9]+)(-[a-zA-Z0-9]+(-[a-zA-Z0-9]+)*)?$"
if [[ ! $FLAT_RELEASE_VERSION =~ $semver_regex ]]; then
echo "Invalid Release Version Format : $FLAT_RELEASE_VERSION"
echo "Please provide a valid SemVer version"
echo "e.g. v1.2.3 or v1.2.3-alpha-1"
echo "Exiting the build process"
exit 1 # Exit with status 1 to fail the step
fi
BUILD_RELEASE=true
RELVERSION=$FLAT_RELEASE_VERSION
if [ "${{ env.IS_PRERELEASE }}" == "true" ]; then
BUILD_PRERELEASE=true
fi
fi
echo "BUILD_RELEASE=${BUILD_RELEASE}" >> $GITHUB_OUTPUT
echo "BUILD_PRERELEASE=${BUILD_PRERELEASE}" >> $GITHUB_OUTPUT
echo "RELEASE_VERSION=${RELVERSION}" >> $GITHUB_OUTPUT
- id: checkout_files
name: Checkout Files
@@ -61,281 +134,250 @@ jobs:
admin:
- admin/**
- packages/**
- 'package.json'
- 'yarn.lock'
- 'tsconfig.json'
- 'turbo.json'
- "package.json"
- "yarn.lock"
- "tsconfig.json"
- "turbo.json"
space:
- space/**
- packages/**
- 'package.json'
- 'yarn.lock'
- 'tsconfig.json'
- 'turbo.json'
- "package.json"
- "yarn.lock"
- "tsconfig.json"
- "turbo.json"
web:
- web/**
- packages/**
- "package.json"
- "yarn.lock"
- "tsconfig.json"
- "turbo.json"
live:
- live/**
- packages/**
- 'package.json'
- 'yarn.lock'
- 'tsconfig.json'
- 'turbo.json'
branch_build_push_web:
if: ${{ needs.branch_build_setup.outputs.build_web == 'true' || github.event_name == 'workflow_dispatch' || github.event_name == 'release' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
runs-on: ubuntu-20.04
needs: [branch_build_setup]
env:
FRONTEND_TAG: makeplane/plane-frontend:${{ needs.branch_build_setup.outputs.gh_branch_name }}
TARGET_BRANCH: ${{ needs.branch_build_setup.outputs.gh_branch_name }}
BUILDX_DRIVER: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
BUILDX_VERSION: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
BUILDX_PLATFORMS: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
BUILDX_ENDPOINT: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
steps:
- name: Set Frontend Docker Tag
run: |
if [ "${{ github.event_name }}" == "release" ]; then
TAG=makeplane/plane-frontend:stable,makeplane/plane-frontend:${{ github.event.release.tag_name }}
elif [ "${{ env.TARGET_BRANCH }}" == "master" ]; then
TAG=makeplane/plane-frontend:latest
else
TAG=${{ env.FRONTEND_TAG }}
fi
echo "FRONTEND_TAG=${TAG}" >> $GITHUB_ENV
- name: Login to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
with:
driver: ${{ env.BUILDX_DRIVER }}
version: ${{ env.BUILDX_VERSION }}
endpoint: ${{ env.BUILDX_ENDPOINT }}
- name: Check out the repo
uses: actions/checkout@v4
- name: Build and Push Frontend to Docker Container Registry
uses: docker/build-push-action@v5.1.0
with:
context: .
file: ./web/Dockerfile.web
platforms: ${{ env.BUILDX_PLATFORMS }}
tags: ${{ env.FRONTEND_TAG }}
push: true
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
branch_build_push_admin:
if: ${{ needs.branch_build_setup.outputs.build_admin== 'true' || github.event_name == 'workflow_dispatch' || github.event_name == 'release' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
if: ${{ needs.branch_build_setup.outputs.build_admin == 'true' || github.event_name == 'workflow_dispatch' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
name: Build-Push Admin Docker Image
runs-on: ubuntu-20.04
needs: [branch_build_setup]
env:
ADMIN_TAG: makeplane/plane-admin:${{ needs.branch_build_setup.outputs.gh_branch_name }}
TARGET_BRANCH: ${{ needs.branch_build_setup.outputs.gh_branch_name }}
BUILDX_DRIVER: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
BUILDX_VERSION: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
BUILDX_PLATFORMS: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
BUILDX_ENDPOINT: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
steps:
- name: Set Admin Docker Tag
run: |
if [ "${{ github.event_name }}" == "release" ]; then
TAG=makeplane/plane-admin:stable,makeplane/plane-admin:${{ github.event.release.tag_name }}
elif [ "${{ env.TARGET_BRANCH }}" == "master" ]; then
TAG=makeplane/plane-admin:latest
else
TAG=${{ env.ADMIN_TAG }}
fi
echo "ADMIN_TAG=${TAG}" >> $GITHUB_ENV
- name: Login to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
with:
driver: ${{ env.BUILDX_DRIVER }}
version: ${{ env.BUILDX_VERSION }}
endpoint: ${{ env.BUILDX_ENDPOINT }}
- name: Check out the repo
- id: checkout_files
name: Checkout Files
uses: actions/checkout@v4
- name: Build and Push Frontend to Docker Container Registry
uses: docker/build-push-action@v5.1.0
- name: Admin Build and Push
uses: ./.github/actions/build-push-ce
with:
context: .
file: ./admin/Dockerfile.admin
platforms: ${{ env.BUILDX_PLATFORMS }}
tags: ${{ env.ADMIN_TAG }}
push: true
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
build-release: ${{ needs.branch_build_setup.outputs.build_release }}
build-prerelease: ${{ needs.branch_build_setup.outputs.build_prerelease }}
release-version: ${{ needs.branch_build_setup.outputs.release_version }}
docker-username: ${{ secrets.DOCKERHUB_USERNAME }}
docker-token: ${{ secrets.DOCKERHUB_TOKEN }}
docker-image-owner: makeplane
docker-image-name: ${{ needs.branch_build_setup.outputs.dh_img_admin }}
build-context: .
dockerfile-path: ./admin/Dockerfile.admin
buildx-driver: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
buildx-version: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
buildx-platforms: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
buildx-endpoint: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
branch_build_push_web:
if: ${{ needs.branch_build_setup.outputs.build_web == 'true' || github.event_name == 'workflow_dispatch' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
name: Build-Push Web Docker Image
runs-on: ubuntu-20.04
needs: [branch_build_setup]
steps:
- id: checkout_files
name: Checkout Files
uses: actions/checkout@v4
- name: Web Build and Push
uses: ./.github/actions/build-push-ce
with:
build-release: ${{ needs.branch_build_setup.outputs.build_release }}
build-prerelease: ${{ needs.branch_build_setup.outputs.build_prerelease }}
release-version: ${{ needs.branch_build_setup.outputs.release_version }}
docker-username: ${{ secrets.DOCKERHUB_USERNAME }}
docker-token: ${{ secrets.DOCKERHUB_TOKEN }}
docker-image-owner: makeplane
docker-image-name: ${{ needs.branch_build_setup.outputs.dh_img_web }}
build-context: .
dockerfile-path: ./web/Dockerfile.web
buildx-driver: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
buildx-version: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
buildx-platforms: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
buildx-endpoint: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
branch_build_push_space:
if: ${{ needs.branch_build_setup.outputs.build_space == 'true' || github.event_name == 'workflow_dispatch' || github.event_name == 'release' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
if: ${{ needs.branch_build_setup.outputs.build_space == 'true' || github.event_name == 'workflow_dispatch' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
name: Build-Push Space Docker Image
runs-on: ubuntu-20.04
needs: [branch_build_setup]
env:
SPACE_TAG: makeplane/plane-space:${{ needs.branch_build_setup.outputs.gh_branch_name }}
TARGET_BRANCH: ${{ needs.branch_build_setup.outputs.gh_branch_name }}
BUILDX_DRIVER: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
BUILDX_VERSION: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
BUILDX_PLATFORMS: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
BUILDX_ENDPOINT: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
steps:
- name: Set Space Docker Tag
run: |
if [ "${{ github.event_name }}" == "release" ]; then
TAG=makeplane/plane-space:stable,makeplane/plane-space:${{ github.event.release.tag_name }}
elif [ "${{ env.TARGET_BRANCH }}" == "master" ]; then
TAG=makeplane/plane-space:latest
else
TAG=${{ env.SPACE_TAG }}
fi
echo "SPACE_TAG=${TAG}" >> $GITHUB_ENV
- name: Login to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
with:
driver: ${{ env.BUILDX_DRIVER }}
version: ${{ env.BUILDX_VERSION }}
endpoint: ${{ env.BUILDX_ENDPOINT }}
- name: Check out the repo
- id: checkout_files
name: Checkout Files
uses: actions/checkout@v4
- name: Build and Push Space to Docker Hub
uses: docker/build-push-action@v5.1.0
- name: Space Build and Push
uses: ./.github/actions/build-push-ce
with:
context: .
file: ./space/Dockerfile.space
platforms: ${{ env.BUILDX_PLATFORMS }}
tags: ${{ env.SPACE_TAG }}
push: true
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
build-release: ${{ needs.branch_build_setup.outputs.build_release }}
build-prerelease: ${{ needs.branch_build_setup.outputs.build_prerelease }}
release-version: ${{ needs.branch_build_setup.outputs.release_version }}
docker-username: ${{ secrets.DOCKERHUB_USERNAME }}
docker-token: ${{ secrets.DOCKERHUB_TOKEN }}
docker-image-owner: makeplane
docker-image-name: ${{ needs.branch_build_setup.outputs.dh_img_space }}
build-context: .
dockerfile-path: ./space/Dockerfile.space
buildx-driver: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
buildx-version: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
buildx-platforms: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
buildx-endpoint: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
branch_build_push_live:
if: ${{ needs.branch_build_setup.outputs.build_live == 'true' || github.event_name == 'workflow_dispatch' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
name: Build-Push Live Collaboration Docker Image
runs-on: ubuntu-20.04
needs: [branch_build_setup]
steps:
- id: checkout_files
name: Checkout Files
uses: actions/checkout@v4
- name: Live Build and Push
uses: ./.github/actions/build-push-ce
with:
build-release: ${{ needs.branch_build_setup.outputs.build_release }}
build-prerelease: ${{ needs.branch_build_setup.outputs.build_prerelease }}
release-version: ${{ needs.branch_build_setup.outputs.release_version }}
docker-username: ${{ secrets.DOCKERHUB_USERNAME }}
docker-token: ${{ secrets.DOCKERHUB_TOKEN }}
docker-image-owner: makeplane
docker-image-name: ${{ needs.branch_build_setup.outputs.dh_img_live }}
build-context: .
dockerfile-path: ./live/Dockerfile.live
buildx-driver: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
buildx-version: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
buildx-platforms: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
buildx-endpoint: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
branch_build_push_apiserver:
if: ${{ needs.branch_build_setup.outputs.build_apiserver == 'true' || github.event_name == 'workflow_dispatch' || github.event_name == 'release' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
if: ${{ needs.branch_build_setup.outputs.build_apiserver == 'true' || github.event_name == 'workflow_dispatch' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
name: Build-Push API Server Docker Image
runs-on: ubuntu-20.04
needs: [branch_build_setup]
env:
BACKEND_TAG: makeplane/plane-backend:${{ needs.branch_build_setup.outputs.gh_branch_name }}
TARGET_BRANCH: ${{ needs.branch_build_setup.outputs.gh_branch_name }}
BUILDX_DRIVER: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
BUILDX_VERSION: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
BUILDX_PLATFORMS: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
BUILDX_ENDPOINT: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
steps:
- name: Set Backend Docker Tag
run: |
if [ "${{ github.event_name }}" == "release" ]; then
TAG=makeplane/plane-backend:stable,makeplane/plane-backend:${{ github.event.release.tag_name }}
elif [ "${{ env.TARGET_BRANCH }}" == "master" ]; then
TAG=makeplane/plane-backend:latest
else
TAG=${{ env.BACKEND_TAG }}
fi
echo "BACKEND_TAG=${TAG}" >> $GITHUB_ENV
- name: Login to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
with:
driver: ${{ env.BUILDX_DRIVER }}
version: ${{ env.BUILDX_VERSION }}
endpoint: ${{ env.BUILDX_ENDPOINT }}
- name: Check out the repo
- id: checkout_files
name: Checkout Files
uses: actions/checkout@v4
- name: Build and Push Backend to Docker Hub
uses: docker/build-push-action@v5.1.0
- name: Backend Build and Push
uses: ./.github/actions/build-push-ce
with:
context: ./apiserver
file: ./apiserver/Dockerfile.api
platforms: ${{ env.BUILDX_PLATFORMS }}
push: true
tags: ${{ env.BACKEND_TAG }}
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
build-release: ${{ needs.branch_build_setup.outputs.build_release }}
build-prerelease: ${{ needs.branch_build_setup.outputs.build_prerelease }}
release-version: ${{ needs.branch_build_setup.outputs.release_version }}
docker-username: ${{ secrets.DOCKERHUB_USERNAME }}
docker-token: ${{ secrets.DOCKERHUB_TOKEN }}
docker-image-owner: makeplane
docker-image-name: ${{ needs.branch_build_setup.outputs.dh_img_backend }}
build-context: ./apiserver
dockerfile-path: ./apiserver/Dockerfile.api
buildx-driver: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
buildx-version: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
buildx-platforms: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
buildx-endpoint: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
branch_build_push_proxy:
if: ${{ needs.branch_build_setup.outputs.build_proxy == 'true' || github.event_name == 'workflow_dispatch' || github.event_name == 'release' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
if: ${{ needs.branch_build_setup.outputs.build_proxy == 'true' || github.event_name == 'workflow_dispatch' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
name: Build-Push Proxy Docker Image
runs-on: ubuntu-20.04
needs: [branch_build_setup]
env:
PROXY_TAG: makeplane/plane-proxy:${{ needs.branch_build_setup.outputs.gh_branch_name }}
TARGET_BRANCH: ${{ needs.branch_build_setup.outputs.gh_branch_name }}
BUILDX_DRIVER: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
BUILDX_VERSION: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
BUILDX_PLATFORMS: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
BUILDX_ENDPOINT: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
steps:
- name: Set Proxy Docker Tag
run: |
if [ "${{ github.event_name }}" == "release" ]; then
TAG=makeplane/plane-proxy:stable,makeplane/plane-proxy:${{ github.event.release.tag_name }}
elif [ "${{ env.TARGET_BRANCH }}" == "master" ]; then
TAG=makeplane/plane-proxy:latest
else
TAG=${{ env.PROXY_TAG }}
fi
echo "PROXY_TAG=${TAG}" >> $GITHUB_ENV
- name: Login to Docker Hub
uses: docker/login-action@v3
- id: checkout_files
name: Checkout Files
uses: actions/checkout@v4
- name: Proxy Build and Push
uses: ./.github/actions/build-push-ce
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
build-release: ${{ needs.branch_build_setup.outputs.build_release }}
build-prerelease: ${{ needs.branch_build_setup.outputs.build_prerelease }}
release-version: ${{ needs.branch_build_setup.outputs.release_version }}
docker-username: ${{ secrets.DOCKERHUB_USERNAME }}
docker-token: ${{ secrets.DOCKERHUB_TOKEN }}
docker-image-owner: makeplane
docker-image-name: ${{ needs.branch_build_setup.outputs.dh_img_proxy }}
build-context: ./nginx
dockerfile-path: ./nginx/Dockerfile
buildx-driver: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
buildx-version: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
buildx-platforms: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
buildx-endpoint: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
with:
driver: ${{ env.BUILDX_DRIVER }}
version: ${{ env.BUILDX_VERSION }}
endpoint: ${{ env.BUILDX_ENDPOINT }}
- name: Check out the repo
attach_assets_to_build:
if: ${{ needs.branch_build_setup.outputs.build_type == 'Build' }}
name: Attach Assets to Build
runs-on: ubuntu-20.04
needs: [branch_build_setup]
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Build and Push Plane-Proxy to Docker Hub
uses: docker/build-push-action@v5.1.0
- name: Update Assets
run: |
cp ./deploy/selfhost/install.sh deploy/selfhost/setup.sh
- name: Attach Assets
id: attach_assets
uses: actions/upload-artifact@v4
with:
context: ./nginx
file: ./nginx/Dockerfile
platforms: ${{ env.BUILDX_PLATFORMS }}
tags: ${{ env.PROXY_TAG }}
push: true
name: selfhost-assets
retention-days: 2
path: |
${{ github.workspace }}/deploy/selfhost/setup.sh
${{ github.workspace }}/deploy/selfhost/restore.sh
${{ github.workspace }}/deploy/selfhost/docker-compose.yml
${{ github.workspace }}/deploy/selfhost/variables.env
publish_release:
if: ${{ needs.branch_build_setup.outputs.build_type == 'Release' }}
name: Build Release
runs-on: ubuntu-20.04
needs:
[
branch_build_setup,
branch_build_push_admin,
branch_build_push_web,
branch_build_push_space,
branch_build_push_live,
branch_build_push_apiserver,
branch_build_push_proxy,
]
env:
REL_VERSION: ${{ needs.branch_build_setup.outputs.release_version }}
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Update Assets
run: |
cp ./deploy/selfhost/install.sh deploy/selfhost/setup.sh
- name: Create Release
id: create_release
uses: softprops/action-gh-release@v2.0.8
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # This token is provided by Actions, you do not need to create your own token
with:
tag_name: ${{ env.REL_VERSION }}
name: ${{ env.REL_VERSION }}
draft: false
prerelease: ${{ env.IS_PRERELEASE }}
generate_release_notes: true
files: |
${{ github.workspace }}/deploy/selfhost/setup.sh
${{ github.workspace }}/deploy/selfhost/restore.sh
${{ github.workspace }}/deploy/selfhost/docker-compose.yml
${{ github.workspace }}/deploy/selfhost/variables.env

View File

@@ -8,30 +8,13 @@ on:
env:
CURRENT_BRANCH: ${{ github.ref_name }}
SOURCE_BRANCH: ${{ vars.SYNC_SOURCE_BRANCH_NAME }} # The sync branch such as "sync/ce"
TARGET_BRANCH: ${{ vars.SYNC_TARGET_BRANCH_NAME }} # The target branch that you would like to merge changes like develop
TARGET_BRANCH: "preview" # The target branch that you would like to merge changes like develop
GITHUB_TOKEN: ${{ secrets.ACCESS_TOKEN }} # Personal access token required to modify contents and workflows
REVIEWER: ${{ vars.SYNC_PR_REVIEWER }}
ACCOUNT_USER_NAME: ${{ vars.ACCOUNT_USER_NAME }}
ACCOUNT_USER_EMAIL: ${{ vars.ACCOUNT_USER_EMAIL }}
jobs:
Check_Branch:
runs-on: ubuntu-latest
outputs:
BRANCH_MATCH: ${{ steps.check-branch.outputs.MATCH }}
steps:
- name: Check if current branch matches the secret
id: check-branch
run: |
if [ "$CURRENT_BRANCH" = "$SOURCE_BRANCH" ]; then
echo "MATCH=true" >> $GITHUB_OUTPUT
else
echo "MATCH=false" >> $GITHUB_OUTPUT
fi
Create_PR:
if: ${{ needs.Check_Branch.outputs.BRANCH_MATCH == 'true' }}
needs: [Check_Branch]
create_pull_request:
runs-on: ubuntu-latest
permissions:
pull-requests: write
@@ -59,11 +42,11 @@ jobs:
- name: Create PR to Target Branch
run: |
# get all pull requests and check if there is already a PR
PR_EXISTS=$(gh pr list --base $TARGET_BRANCH --head $SOURCE_BRANCH --state open --json number | jq '.[] | .number')
PR_EXISTS=$(gh pr list --base $TARGET_BRANCH --head $CURRENT_BRANCH --state open --json number | jq '.[] | .number')
if [ -n "$PR_EXISTS" ]; then
echo "Pull Request already exists: $PR_EXISTS"
else
echo "Creating new pull request"
PR_URL=$(gh pr create --base $TARGET_BRANCH --head $SOURCE_BRANCH --title "sync: community changes" --body "")
PR_URL=$(gh pr create --base $TARGET_BRANCH --head $CURRENT_BRANCH --title "${{ vars.SYNC_PR_TITLE }}" --body "")
echo "Pull Request created: $PR_URL"
fi

View File

.idx/dev.nix Normal file
View File

@@ -0,0 +1,16 @@
{ pkgs, ... }: {
# Which nixpkgs channel to use.
channel = "stable-23.11"; # or "unstable"
# Use https://search.nixos.org/packages to find packages
packages = [
pkgs.nodejs_20
pkgs.python3
];
services.docker.enable = true;
services.postgres.enable = true;
services.redis.enable = true;
}

View File

@@ -1,3 +0,0 @@
{
"*.{ts,tsx,js,jsx}": ["eslint -c ./.eslintrc-staged.js", "prettier --check"]
}

View File

@@ -4,7 +4,7 @@ Thank you for showing an interest in contributing to Plane! All kinds of contrib
## Submitting an issue
Before submitting a new issue, please search the [issues](https://github.com/makeplane/plane/issues) tab. Maybe an issue or discussion already exists and might inform you of workarounds. Otherwise, you can give new informplaneation.
Before submitting a new issue, please search the [issues](https://github.com/makeplane/plane/issues) tab. Maybe an issue or discussion already exists and might inform you of workarounds. Otherwise, you can give new information.
While we want to fix all the [issues](https://github.com/makeplane/plane/issues), before fixing a bug we need to be able to reproduce and confirm it. Please provide us with a minimal reproduction scenario using a repository or [Gist](https://gist.github.com/). Having a live, reproducible scenario gives us the information without asking questions back & forth with additional questions like:

View File

@@ -1,44 +1,39 @@
# Security Policy
# Security policy
This document outlines the security protocols and vulnerability reporting guidelines for the Plane project. Ensuring the security of our systems is a top priority, and while we work diligently to maintain robust protection, vulnerabilities may still occur. We highly value the community's role in identifying and reporting security concerns to uphold the integrity of our systems and safeguard our users.
This document outlines security procedures and vulnerabilities reporting for the Plane project.
## Reporting a vulnerability
If you have identified a security vulnerability, submit your findings to [security@plane.so](mailto:security@plane.so).
Ensure your report includes all relevant information needed for us to reproduce and assess the issue. Include the IP address or URL of the affected system.
At Plane, we safeguarding the security of our systems with top priority. Despite our efforts, vulnerabilities may still exist. We greatly appreciate your assistance in identifying and reporting any such vulnerabilities to help us maintain the integrity of our systems and protect our clients.
To ensure a responsible and effective disclosure process, please adhere to the following:
To report a security vulnerability, please email us directly at security@plane.so with a detailed description of the vulnerability and steps to reproduce it. Please refrain from disclosing the vulnerability publicly until we have had an opportunity to review and address it.
- Maintain confidentiality and refrain from publicly disclosing the vulnerability until we have had the opportunity to investigate and address the issue.
- Refrain from running automated vulnerability scans on our infrastructure or dashboard without prior consent. Contact us to set up a sandbox environment if necessary.
- Do not exploit any discovered vulnerabilities for malicious purposes, such as accessing or altering user data.
- Do not engage in physical security attacks, social engineering, distributed denial of service (DDoS) attacks, spam campaigns, or attacks on third-party applications as part of your vulnerability testing.
## Out of Scope Vulnerabilities
## Out of scope
While we appreciate all efforts to assist in improving our security, please note that the following types of vulnerabilities are considered out of scope:
We appreciate your help in identifying vulnerabilities. However, please note that the following types of vulnerabilities are considered out of scope:
- Vulnerabilities requiring man-in-the-middle (MITM) attacks or physical access to a user's device.
- Content spoofing or text injection issues without a clear attack vector or the ability to modify HTML/CSS.
- Issues related to email spoofing.
- Missing DNSSEC, CAA, or CSP headers.
- Absence of secure or HTTP-only flags on non-sensitive cookies.
- Attacks requiring MITM or physical access to a user's device.
- Content spoofing and text injection issues without demonstrating an attack vector or ability to modify HTML/CSS.
- Email spoofing.
- Missing DNSSEC, CAA, CSP headers.
- Lack of Secure or HTTP only flag on non-sensitive cookies.
## Our commitment
## Reporting Process
At Plane, we are committed to maintaining transparent and collaborative communication throughout the vulnerability resolution process. Here's what you can expect from us:
If you discover a vulnerability, please adhere to the following reporting process:
- **Response Time** <br/>
We will acknowledge receipt of your vulnerability report within three business days and provide an estimated timeline for resolution.
- **Legal Protection** <br/>
We will not initiate legal action against you for reporting vulnerabilities, provided you adhere to the reporting guidelines.
- **Confidentiality** <br/>
Your report will be treated with confidentiality. We will not disclose your personal information to third parties without your consent.
- **Recognition** <br/>
With your permission, we are happy to publicly acknowledge your contribution to improving our security once the issue is resolved.
- **Timely Resolution** <br/>
We are committed to working closely with you throughout the resolution process, providing timely updates as necessary. Our goal is to address all reported vulnerabilities swiftly, and we will actively engage with you to coordinate a responsible disclosure once the issue is fully resolved.
1. Email your findings to security@plane.so.
2. Refrain from running automated scanners on our infrastructure or dashboard without prior consent. Contact us to set up a sandbox environment if necessary.
3. Do not exploit the vulnerability for malicious purposes, such as downloading excessive data or altering user data.
4. Maintain confidentiality and refrain from disclosing the vulnerability until it has been resolved.
5. Avoid using physical security attacks, social engineering, distributed denial of service, spam, or third-party applications.
When reporting a vulnerability, please provide sufficient information to allow us to reproduce and address the issue promptly. Include the IP address or URL of the affected system, along with a detailed description of the vulnerability.
## Our Commitment
We are committed to promptly addressing reported vulnerabilities and maintaining open communication throughout the resolution process. Here's what you can expect from us:
- **Response Time:** We will acknowledge receipt of your report within three business days and provide an expected resolution date.
- **Legal Protection:** We will not pursue legal action against you for reporting vulnerabilities, provided you adhere to the reporting guidelines.
- **Confidentiality:** Your report will be treated with strict confidentiality. We will not disclose your personal information to third parties without your consent.
- **Progress Updates:** We will keep you informed of our progress in resolving the reported vulnerability.
- **Recognition:** With your permission, we will publicly acknowledge you as the discoverer of the vulnerability.
- **Timely Resolution:** We strive to resolve all reported vulnerabilities promptly and will actively participate in the publication process once the issue is resolved.
We appreciate your cooperation in helping us maintain the security of our systems and protecting our clients. Thank you for your contributions to our security efforts.
reference: https://supabase.com/.well-known/security.txt
We appreciate your help in ensuring the security of our platform. Your contributions are crucial to protecting our users and maintaining a secure environment. Thank you for working with us to keep Plane safe.

View File

@@ -1,52 +1,8 @@
module.exports = {
root: true,
extends: ["custom"],
extends: ["@plane/eslint-config/next.js"],
parser: "@typescript-eslint/parser",
settings: {
"import/resolver": {
typescript: {},
node: {
moduleDirectory: ["node_modules", "."],
},
},
parserOptions: {
project: true,
},
rules: {
"import/order": [
"error",
{
groups: ["builtin", "external", "internal", "parent", "sibling",],
pathGroups: [
{
pattern: "react",
group: "external",
position: "before",
},
{
pattern: "lucide-react",
group: "external",
position: "after",
},
{
pattern: "@headlessui/**",
group: "external",
position: "after",
},
{
pattern: "@plane/**",
group: "external",
position: "after",
},
{
pattern: "@/**",
group: "internal",
}
],
pathGroupsExcludedImportTypes: ["builtin", "internal", "react"],
alphabetize: {
order: "asc",
caseInsensitive: true,
},
},
],
},
}
};

View File

@@ -10,8 +10,9 @@ import {
// components
import { AuthenticationMethodCard } from "@/components/authentication";
// helpers
import { UpgradeButton } from "@/components/common/upgrade-button";
import { getBaseAuthenticationModes } from "@/helpers/authentication.helper";
// plane admin components
import { UpgradeButton } from "@/plane-admin/components/common";
// images
import OIDCLogo from "@/public/logos/oidc-logo.svg";
import SAMLLogo from "@/public/logos/saml-logo.svg";
@@ -27,24 +28,24 @@ export const getAuthenticationModes: (props: TGetBaseAuthenticationModeProps) =>
updateConfig,
resolvedTheme,
}) => [
...getBaseAuthenticationModes({ disabled, updateConfig, resolvedTheme }),
{
key: "oidc",
name: "OIDC",
description: "Authenticate your users via the OpenID Connect protocol.",
icon: <Image src={OIDCLogo} height={22} width={22} alt="OIDC Logo" />,
config: <UpgradeButton />,
unavailable: true,
},
{
key: "saml",
name: "SAML",
description: "Authenticate your users via the Security Assertion Markup Language protocol.",
icon: <Image src={SAMLLogo} height={22} width={22} alt="SAML Logo" className="pl-0.5" />,
config: <UpgradeButton />,
unavailable: true,
},
];
...getBaseAuthenticationModes({ disabled, updateConfig, resolvedTheme }),
{
key: "oidc",
name: "OIDC",
description: "Authenticate your users via the OpenID Connect protocol.",
icon: <Image src={OIDCLogo} height={22} width={22} alt="OIDC Logo" />,
config: <UpgradeButton />,
unavailable: true,
},
{
key: "saml",
name: "SAML",
description: "Authenticate your users via the Security Assertion Markup Language protocol.",
icon: <Image src={SAMLLogo} height={22} width={22} alt="SAML Logo" className="pl-0.5" />,
config: <UpgradeButton />,
unavailable: true,
},
];
export const AuthenticationModes: React.FC<TAuthenticationModeProps> = observer((props) => {
const { disabled, updateConfig } = props;

View File

@@ -0,0 +1 @@
export * from "./upgrade-button";

View File

@@ -0,0 +1,19 @@
import { enableStaticRendering } from "mobx-react";
// stores
import { CoreRootStore } from "@/store/root.store";
enableStaticRendering(typeof window === "undefined");
export class RootStore extends CoreRootStore {
constructor() {
super();
}
hydrate(initialData: any) {
super.hydrate(initialData);
}
resetOnSignOut() {
super.resetOnSignOut();
}
}

View File

@@ -2,15 +2,14 @@
import { FC, useEffect, useRef } from "react";
import { observer } from "mobx-react";
// hooks
import { HelpSection, SidebarMenu, SidebarDropdown } from "@/components/admin-sidebar";
import { useTheme } from "@/hooks/store";
import useOutsideClickDetector from "@/hooks/use-outside-click-detector";
// plane helpers
import { useOutsideClickDetector } from "@plane/helpers";
// components
import { HelpSection, SidebarMenu, SidebarDropdown } from "@/components/admin-sidebar";
// hooks
import { useTheme } from "@/hooks/store";
export interface IInstanceSidebar {}
export const InstanceSidebar: FC<IInstanceSidebar> = observer(() => {
export const InstanceSidebar: FC = observer(() => {
// store
const { isSidebarCollapsed, toggleSidebar } = useTheme();

View File

@@ -5,11 +5,13 @@ import { observer } from "mobx-react";
import { useTheme as useNextTheme } from "next-themes";
import { LogOut, UserCog2, Palette } from "lucide-react";
import { Menu, Transition } from "@headlessui/react";
// plane ui
import { Avatar } from "@plane/ui";
// hooks
import { API_BASE_URL, cn } from "@/helpers/common.helper";
import { useTheme, useUser } from "@/hooks/store";
// helpers
import { API_BASE_URL, cn } from "@/helpers/common.helper";
import { getFileURL } from "@/helpers/file.helper";
// hooks
import { useTheme, useUser } from "@/hooks/store";
// services
import { AuthService } from "@/services/auth.service";
@@ -122,7 +124,7 @@ export const SidebarDropdown = observer(() => {
<Menu.Button className="grid place-items-center outline-none">
<Avatar
name={currentUser.display_name}
src={currentUser.avatar ?? undefined}
src={getFileURL(currentUser.avatar_url)}
size={24}
shape="square"
className="!text-base"

View File

@@ -0,0 +1,29 @@
import { FC } from "react";
import { Info, X } from "lucide-react";
// helpers
import { TAuthErrorInfo } from "@/helpers/authentication.helper";
type TAuthBanner = {
bannerData: TAuthErrorInfo | undefined;
handleBannerData?: (bannerData: TAuthErrorInfo | undefined) => void;
};
export const AuthBanner: FC<TAuthBanner> = (props) => {
const { bannerData, handleBannerData } = props;
if (!bannerData) return <></>;
return (
<div className="relative flex items-center p-2 rounded-md gap-2 border border-custom-primary-100/50 bg-custom-primary-100/10">
<div className="w-4 h-4 flex-shrink-0 relative flex justify-center items-center">
<Info size={16} className="text-custom-primary-100" />
</div>
<div className="w-full text-sm font-medium text-custom-primary-100">{bannerData?.message}</div>
<div
className="relative ml-auto w-6 h-6 rounded-sm flex justify-center items-center transition-all cursor-pointer hover:bg-custom-primary-100/20 text-custom-primary-100/80"
onClick={() => handleBannerData && handleBannerData(undefined)}
>
<X className="w-4 h-4 flex-shrink-0" />
</div>
</div>
);
};

View File

@@ -1,3 +1,4 @@
export * from "./auth-banner";
export * from "./email-config-switch";
export * from "./password-config-switch";
export * from "./authentication-method-card";

View File

@@ -8,4 +8,3 @@ export * from "./empty-state";
export * from "./logo-spinner";
export * from "./page-header";
export * from "./code-block";
export * from "./upgrade-button";

View File

@@ -7,11 +7,7 @@ import { Button } from "@plane/ui";
import InstanceFailureDarkImage from "@/public/instance/instance-failure-dark.svg";
import InstanceFailureImage from "@/public/instance/instance-failure.svg";
type InstanceFailureViewProps = {
// mutate: () => void;
};
export const InstanceFailureView: FC<InstanceFailureViewProps> = () => {
export const InstanceFailureView: FC = () => {
const { resolvedTheme } = useTheme();
const instanceImage = resolvedTheme === "dark" ? InstanceFailureDarkImage : InstanceFailureImage;

View File

@@ -8,8 +8,16 @@ import { Button, Input, Spinner } from "@plane/ui";
// components
import { Banner } from "@/components/common";
// helpers
import {
authErrorHandler,
EAuthenticationErrorCodes,
EErrorAlertType,
TAuthErrorInfo,
} from "@/helpers/authentication.helper";
import { API_BASE_URL } from "@/helpers/common.helper";
import { AuthService } from "@/services/auth.service";
import { AuthBanner } from "../authentication";
// ui
// icons
@@ -53,6 +61,7 @@ export const InstanceSignInForm: FC = (props) => {
const [csrfToken, setCsrfToken] = useState<string | undefined>(undefined);
const [formData, setFormData] = useState<TFormData>(defaultFromData);
const [isSubmitting, setIsSubmitting] = useState(false);
const [errorInfo, setErrorInfo] = useState<TAuthErrorInfo | undefined>(undefined);
const handleFormChange = (key: keyof TFormData, value: string | boolean) =>
setFormData((prev) => ({ ...prev, [key]: value }));
@@ -91,6 +100,15 @@ export const InstanceSignInForm: FC = (props) => {
[formData.email, formData.password, isSubmitting]
);
useEffect(() => {
if (errorCode) {
const errorDetail = authErrorHandler(errorCode?.toString() as EAuthenticationErrorCodes);
if (errorDetail) {
setErrorInfo(errorDetail);
}
}
}, [errorCode]);
return (
<div className="flex-grow container mx-auto max-w-lg px-10 lg:max-w-md lg:px-5 py-10 lg:pt-28 transition-all">
<div className="relative flex flex-col space-y-6">
@@ -103,7 +121,11 @@ export const InstanceSignInForm: FC = (props) => {
</p>
</div>
{errorData.type && errorData?.message && <Banner type="error" message={errorData?.message} />}
{errorData.type && errorData?.message ? (
<Banner type="error" message={errorData?.message} />
) : (
<>{errorInfo && <AuthBanner bannerData={errorInfo} handleBannerData={(value) => setErrorInfo(value)} />}</>
)}
<form
className="space-y-4"

View File

@@ -1,21 +0,0 @@
"use client";
import React, { useEffect } from "react";
const useOutsideClickDetector = (ref: React.RefObject<HTMLElement>, callback: () => void) => {
const handleClick = (event: MouseEvent) => {
if (ref.current && !ref.current.contains(event.target as Node)) {
callback();
}
};
useEffect(() => {
document.addEventListener("mousedown", handleClick);
return () => {
document.removeEventListener("mousedown", handleClick);
};
});
};
export default useOutsideClickDetector;

View File

@@ -18,6 +18,7 @@ export const AdminLayout: FC<TAdminLayout> = observer((props) => {
const { children } = props;
// router
const router = useRouter();
// store hooks
const { isUserLoggedIn } = useUser();
useEffect(() => {

View File

@@ -1,8 +1,8 @@
"use client";
import { ReactNode, createContext } from "react";
// store
import { RootStore } from "@/store/root.store";
// plane admin store
import { RootStore } from "@/plane-admin/store/root.store";
let rootStore = new RootStore();

View File

@@ -1,5 +1,5 @@
// helpers
import { API_BASE_URL } from "helpers/common.helper";
import { API_BASE_URL } from "@/helpers/common.helper";
// services
import { APIService } from "@/services/api.service";

View File

@@ -1,7 +1,7 @@
// helpers
import { API_BASE_URL } from "helpers/common.helper";
// types
import type { IUser } from "@plane/types";
// helpers
import { API_BASE_URL } from "@/helpers/common.helper";
// services
import { APIService } from "@/services/api.service";

View File

@@ -13,7 +13,7 @@ import { EInstanceStatus, TInstanceStatus } from "@/helpers/instance.helper";
// services
import { InstanceService } from "@/services/instance.service";
// root store
import { RootStore } from "@/store/root.store";
import { CoreRootStore } from "@/store/root.store";
export interface IInstanceStore {
// issues
@@ -46,7 +46,7 @@ export class InstanceStore implements IInstanceStore {
// service
instanceService;
constructor(private store: RootStore) {
constructor(private store: CoreRootStore) {
makeObservable(this, {
// observable
isLoading: observable.ref,

View File

@@ -6,7 +6,7 @@ import { IUserStore, UserStore } from "./user.store";
enableStaticRendering(typeof window === "undefined");
export class RootStore {
export abstract class CoreRootStore {
theme: IThemeStore;
instance: IInstanceStore;
user: IUserStore;

View File

@@ -1,6 +1,6 @@
import { action, observable, makeObservable } from "mobx";
// root store
import { RootStore } from "@/store/root.store";
import { CoreRootStore } from "@/store/root.store";
type TTheme = "dark" | "light";
export interface IThemeStore {
@@ -21,7 +21,7 @@ export class ThemeStore implements IThemeStore {
isSidebarCollapsed: boolean | undefined = undefined;
theme: string | undefined = undefined;
constructor(private store: RootStore) {
constructor(private store: CoreRootStore) {
makeObservable(this, {
// observables
isNewUserPopup: observable.ref,

View File

@@ -6,7 +6,7 @@ import { EUserStatus, TUserStatus } from "@/helpers/user.helper";
import { AuthService } from "@/services/auth.service";
import { UserService } from "@/services/user.service";
// root store
import { RootStore } from "@/store/root.store";
import { CoreRootStore } from "@/store/root.store";
export interface IUserStore {
// observables
@@ -31,7 +31,7 @@ export class UserStore implements IUserStore {
userService;
authService;
constructor(private store: RootStore) {
constructor(private store: CoreRootStore) {
makeObservable(this, {
// observables
isLoading: observable.ref,

View File

@@ -0,0 +1 @@
export * from "ce/components/common";

View File

@@ -0,0 +1 @@
export * from "ce/store/root.store";

View File

@@ -0,0 +1,14 @@
// helpers
import { API_BASE_URL } from "@/helpers/common.helper";
/**
* @description combine the file path with the base URL
* @param {string} path
* @returns {string} final URL with the base URL
*/
export const getFileURL = (path: string): string | undefined => {
if (!path) return undefined;
const isValidURL = path.startsWith("http");
if (isValidURL) return path;
return `${API_BASE_URL}${path}`;
};
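A minimal TypeScript usage sketch of the getFileURL helper added above, using the same import path referenced elsewhere in this changeset; the sample asset paths are illustrative assumptions, not values taken from the diff.
import { getFileURL } from "@/helpers/file.helper";
// Relative asset paths are prefixed with API_BASE_URL.
const avatar = getFileURL("/api/assets/v2/static/1234/"); // `${API_BASE_URL}/api/assets/v2/static/1234/`
// Absolute URLs are returned unchanged.
const logo = getFileURL("https://example.com/logo.png"); // "https://example.com/logo.png"
// Empty or missing paths return undefined, so callers can supply a fallback.
const fallbackSrc = getFileURL("") ?? "/default-avatar.png";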

View File

@@ -0,0 +1,21 @@
/**
* @description
* This function test whether a URL is valid or not.
*
* It accepts URLs with or without the protocol.
* @param {string} url
* @returns {boolean}
* @example
* checkURLValidity("https://example.com") => true
* checkURLValidity("example.com") => true
* checkURLValidity("example") => false
*/
export const checkURLValidity = (url: string): boolean => {
if (!url) return false;
// regex to support complex query parameters and fragments
const urlPattern =
/^(https?:\/\/)?((([a-z\d-]+\.)*[a-z\d-]+\.[a-z]{2,6})|(\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}))(:\d+)?(\/[\w.-]*)*(\?[^#\s]*)?(#[\w-]*)?$/i;
return urlPattern.test(url);
};
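A small TypeScript sketch exercising the checkURLValidity helper above; the import path and sample URLs are assumptions for illustration, since the file location is not visible in this hunk.
import { checkURLValidity } from "@/helpers/url.helper"; // assumed location of the helper
// Accepts URLs with or without a protocol, including query parameters and fragments.
checkURLValidity("https://app.plane.so/projects?view=list#top"); // true
checkURLValidity("example.com"); // true
// Rejects bare words and empty strings.
checkURLValidity("example"); // false
checkURLValidity(""); // false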

admin/next-env.d.ts vendored
View File

@@ -2,4 +2,4 @@
/// <reference types="next/image-types/global" />
// NOTE: This file should not be edited
// see https://nextjs.org/docs/basic-features/typescript for more information.
// see https://nextjs.org/docs/app/building-your-application/configuring/typescript for more information.

View File

@@ -1,6 +1,6 @@
{
"name": "admin",
"version": "0.22.0",
"version": "0.23.1",
"private": true,
"scripts": {
"dev": "turbo run develop",
@@ -8,43 +8,44 @@
"build": "next build",
"preview": "next build && next start",
"start": "next start",
"lint": "next lint"
"lint": "eslint . --ext .ts,.tsx",
"lint:errors": "eslint . --ext .ts,.tsx --quiet"
},
"dependencies": {
"@headlessui/react": "^1.7.19",
"@plane/constants": "*",
"@plane/helpers": "*",
"@plane/types": "*",
"@plane/ui": "*",
"@plane/constants": "*",
"@sentry/nextjs": "^8.32.0",
"@tailwindcss/typography": "^0.5.9",
"@types/lodash": "^4.17.0",
"autoprefixer": "10.4.14",
"axios": "^1.6.7",
"js-cookie": "^3.0.5",
"axios": "^1.7.4",
"lodash": "^4.17.21",
"lucide-react": "^0.356.0",
"mobx": "^6.12.0",
"mobx-react": "^9.1.1",
"next": "^14.2.3",
"next": "^14.2.12",
"next-themes": "^0.2.1",
"postcss": "^8.4.38",
"react": "^18.3.1",
"react-dom": "^18.3.1",
"react-hook-form": "^7.51.0",
"react-hook-form": "7.51.5",
"swr": "^2.2.4",
"tailwindcss": "3.3.2",
"uuid": "^9.0.1",
"zxcvbn": "^4.4.2"
},
"devDependencies": {
"@types/js-cookie": "^3.0.6",
"@plane/eslint-config": "*",
"@plane/typescript-config": "*",
"@types/node": "18.16.1",
"@types/react": "^18.2.48",
"@types/react": "^18.3.11",
"@types/react-dom": "^18.2.18",
"@types/uuid": "^9.0.8",
"@types/zxcvbn": "^4.4.4",
"eslint-config-custom": "*",
"tailwind-config-custom": "*",
"tsconfig": "*",
"typescript": "^5.4.2"
"typescript": "5.3.3"
}
}
}

View File

@@ -1,21 +1,15 @@
{
"extends": "tsconfig/nextjs.json",
"include": ["next-env.d.ts", "**/*.ts", "**/*.tsx", ".next/types/**/*.ts"],
"exclude": ["node_modules"],
"extends": "@plane/typescript-config/nextjs.json",
"compilerOptions": {
"plugins": [{ "name": "next" }],
"baseUrl": ".",
"jsx": "preserve",
"esModuleInterop": true,
"paths": {
"@/*": ["core/*"],
"@/helpers/*": ["helpers/*"],
"@/public/*": ["public/*"],
"@/plane-admin/*": ["ce/*"]
},
"plugins": [
{
"name": "next"
}
]
}
}
},
"include": ["next-env.d.ts", "next.config.js", "**/*.ts", "**/*.tsx", ".next/types/**/*.ts"],
"exclude": ["node_modules"]
}

View File

@@ -15,12 +15,18 @@ POSTGRES_DB="plane"
POSTGRES_PORT=5432
DATABASE_URL=postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@${POSTGRES_HOST}:${POSTGRES_PORT}/${POSTGRES_DB}
# Redis Settings
REDIS_HOST="plane-redis"
REDIS_PORT="6379"
REDIS_URL="redis://${REDIS_HOST}:6379/"
# RabbitMQ Settings
RABBITMQ_HOST="plane-mq"
RABBITMQ_PORT="5672"
RABBITMQ_USER="plane"
RABBITMQ_PASSWORD="plane"
RABBITMQ_VHOST="plane"
# AWS Settings
AWS_REGION=""
AWS_ACCESS_KEY_ID="access-key"
@@ -51,5 +57,6 @@ ADMIN_BASE_URL=
SPACE_BASE_URL=
APP_BASE_URL=
# Hard delete files after days
HARD_DELETE_AFTER_DAYS=
HARD_DELETE_AFTER_DAYS=60

View File

@@ -1,4 +1,4 @@
FROM python:3.11.1-alpine3.17 AS backend
FROM python:3.12.5-alpine AS backend
# set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
@@ -7,23 +7,23 @@ ENV PIP_DISABLE_PIP_VERSION_CHECK=1
WORKDIR /code
RUN apk --no-cache add \
"libpq~=15" \
"libxslt~=1.1" \
"nodejs-current~=19" \
"xmlsec~=1.2"
RUN apk add --no-cache \
"libpq" \
"libxslt" \
"nodejs-current" \
"xmlsec"
COPY requirements.txt ./
COPY requirements ./requirements
RUN apk add --no-cache libffi-dev
RUN apk add --no-cache --virtual .build-deps \
"bash~=5.2" \
"g++~=12.2" \
"gcc~=12.2" \
"cargo~=1.64" \
"git~=2" \
"make~=4.3" \
"postgresql13-dev~=13" \
"g++" \
"gcc" \
"cargo" \
"git" \
"make" \
"postgresql-dev" \
"libc-dev" \
"linux-headers" \
&& \

View File

@@ -1,4 +1,4 @@
FROM python:3.11.1-alpine3.17 AS backend
FROM python:3.12.5-alpine AS backend
# set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
@@ -7,18 +7,18 @@ ENV PIP_DISABLE_PIP_VERSION_CHECK=1
RUN apk --no-cache add \
"bash~=5.2" \
"libpq~=15" \
"libxslt~=1.1" \
"nodejs-current~=19" \
"xmlsec~=1.2" \
"libpq" \
"libxslt" \
"nodejs-current" \
"xmlsec" \
"libffi-dev" \
"bash~=5.2" \
"g++~=12.2" \
"gcc~=12.2" \
"cargo~=1.64" \
"git~=2" \
"make~=4.3" \
"postgresql13-dev~=13" \
"g++" \
"gcc" \
"cargo" \
"git" \
"make" \
"postgresql-dev" \
"libc-dev" \
"linux-headers"

View File

@@ -32,4 +32,3 @@ python manage.py create_bucket
python manage.py clear_cache
python manage.py runserver 0.0.0.0:8000 --settings=plane.settings.local

View File

@@ -1,4 +1,4 @@
{
"name": "plane-api",
"version": "0.22.0"
"version": "0.23.1"
}

View File

@@ -40,3 +40,44 @@ class ApiKeyRateThrottle(SimpleRateThrottle):
request.META["X-RateLimit-Reset"] = reset_time
return allowed
class ServiceTokenRateThrottle(SimpleRateThrottle):
scope = "service_token"
rate = "300/minute"
def get_cache_key(self, request, view):
# Retrieve the API key from the request header
api_key = request.headers.get("X-Api-Key")
if not api_key:
return None # Allow the request if there's no API key
# Use the API key as part of the cache key
return f"{self.scope}:{api_key}"
def allow_request(self, request, view):
allowed = super().allow_request(request, view)
if allowed:
now = self.timer()
# Calculate the remaining limit and reset time
history = self.cache.get(self.key, [])
# Remove old histories
while history and history[-1] <= now - self.duration:
history.pop()
# Calculate the requests
num_requests = len(history)
# Check available requests
available = self.num_requests - num_requests
# Unix timestamp for when the rate limit will reset
reset_time = int(now + self.duration)
# Add headers
request.META["X-RateLimit-Remaining"] = max(0, available)
request.META["X-RateLimit-Reset"] = reset_time
return allowed

View File

@@ -5,11 +5,11 @@ from .issue import (
IssueSerializer,
LabelSerializer,
IssueLinkSerializer,
IssueAttachmentSerializer,
IssueCommentSerializer,
IssueAttachmentSerializer,
IssueActivitySerializer,
IssueExpandSerializer,
IssueLiteSerializer,
)
from .state import StateLiteSerializer, StateSerializer
from .cycle import CycleSerializer, CycleIssueSerializer, CycleLiteSerializer

View File

@@ -67,6 +67,7 @@ class BaseSerializer(serializers.ModelSerializer):
# Import all the expandable serializers
from . import (
IssueSerializer,
IssueLiteSerializer,
ProjectLiteSerializer,
StateLiteSerializer,
UserLiteSerializer,
@@ -86,6 +87,7 @@ class BaseSerializer(serializers.ModelSerializer):
"actor": UserLiteSerializer,
"owned_by": UserLiteSerializer,
"members": UserLiteSerializer,
"parent": IssueLiteSerializer,
}
# Check if field in expansion then expand the field
if expand in expansion:

View File

@@ -1,6 +1,3 @@
from django.core.exceptions import ValidationError
from django.core.validators import URLValidator
# Django imports
from django.utils import timezone
from lxml import html
@@ -11,9 +8,10 @@ from rest_framework import serializers
# Module imports
from plane.db.models import (
Issue,
IssueType,
IssueActivity,
IssueAssignee,
IssueAttachment,
FileAsset,
IssueComment,
IssueLabel,
IssueLink,
@@ -29,6 +27,10 @@ from .module import ModuleLiteSerializer, ModuleSerializer
from .state import StateLiteSerializer
from .user import UserLiteSerializer
# Django imports
from django.core.exceptions import ValidationError
from django.core.validators import URLValidator
class IssueSerializer(BaseSerializer):
assignees = serializers.ListField(
@@ -46,6 +48,12 @@ class IssueSerializer(BaseSerializer):
write_only=True,
required=False,
)
type_id = serializers.PrimaryKeyRelatedField(
source="type",
queryset=IssueType.objects.all(),
required=False,
allow_null=True,
)
class Meta:
model = Issue
@@ -129,9 +137,19 @@ class IssueSerializer(BaseSerializer):
workspace_id = self.context["workspace_id"]
default_assignee_id = self.context["default_assignee_id"]
issue_type = validated_data.pop("type", None)
if not issue_type:
# Get default issue type
issue_type = IssueType.objects.filter(
project_issue_types__project_id=project_id, is_default=True
).first()
issue_type = issue_type
issue = Issue.objects.create(
**validated_data,
project_id=project_id,
type=issue_type,
)
# Issue Audit Users
@@ -257,6 +275,17 @@ class IssueSerializer(BaseSerializer):
return data
class IssueLiteSerializer(BaseSerializer):
class Meta:
model = Issue
fields = [
"id",
"sequence_id",
"project_id",
]
read_only_fields = fields
class LabelSerializer(BaseSerializer):
class Meta:
model = Label
@@ -331,7 +360,7 @@ class IssueLinkSerializer(BaseSerializer):
class IssueAttachmentSerializer(BaseSerializer):
class Meta:
model = IssueAttachment
model = FileAsset
fields = "__all__"
read_only_fields = [
"id",

View File

@@ -71,6 +71,16 @@ class ModuleSerializer(BaseSerializer):
project_id = self.context["project_id"]
workspace_id = self.context["workspace_id"]
module_name = validated_data.get("name")
if module_name:
# Lookup for the module name in the module table for that project
if Module.objects.filter(
name=module_name, project_id=project_id
).exists():
raise serializers.ValidationError(
{"error": "Module with this name already exists"}
)
module = Module.objects.create(**validated_data, project_id=project_id)
if members is not None:
ModuleMember.objects.bulk_create(
@@ -93,6 +103,19 @@ class ModuleSerializer(BaseSerializer):
def update(self, instance, validated_data):
members = validated_data.pop("members", None)
module_name = validated_data.get("name")
if module_name:
# Lookup for the module name in the module table for that project
if (
Module.objects.filter(
name=module_name, project=instance.project
)
.exclude(id=instance.id)
.exists()
):
raise serializers.ValidationError(
{"error": "Module with this name already exists"}
)
if members is not None:
ModuleMember.objects.filter(module=instance).delete()

View File

@@ -19,6 +19,7 @@ class ProjectSerializer(BaseSerializer):
sort_order = serializers.FloatField(read_only=True)
member_role = serializers.IntegerField(read_only=True)
is_deployed = serializers.BooleanField(read_only=True)
cover_image_url = serializers.CharField(read_only=True)
class Meta:
model = Project
@@ -32,6 +33,7 @@ class ProjectSerializer(BaseSerializer):
"created_by",
"updated_by",
"deleted_at",
"cover_image_url",
]
def validate(self, data):
@@ -87,6 +89,8 @@ class ProjectSerializer(BaseSerializer):
class ProjectLiteSerializer(BaseSerializer):
cover_image_url = serializers.CharField(read_only=True)
class Meta:
model = Project
fields = [
@@ -97,5 +101,6 @@ class ProjectLiteSerializer(BaseSerializer):
"icon_prop",
"emoji",
"description",
"cover_image_url",
]
read_only_fields = fields

View File

@@ -13,6 +13,7 @@ class UserLiteSerializer(BaseSerializer):
"last_name",
"email",
"avatar",
"avatar_url",
"display_name",
"email",
]

View File

@@ -7,6 +7,7 @@ from django.core.exceptions import ObjectDoesNotExist, ValidationError
from django.db import IntegrityError
from django.urls import resolve
from django.utils import timezone
from plane.db.models.api import APIToken
from rest_framework import status
from rest_framework.permissions import IsAuthenticated
from rest_framework.response import Response
@@ -16,7 +17,7 @@ from rest_framework.views import APIView
# Module imports
from plane.api.middleware.api_authentication import APIKeyAuthentication
from plane.api.rate_limit import ApiKeyRateThrottle
from plane.api.rate_limit import ApiKeyRateThrottle, ServiceTokenRateThrottle
from plane.utils.exception_logger import log_exception
from plane.utils.paginator import BasePaginator
@@ -44,15 +45,29 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
IsAuthenticated,
]
throttle_classes = [
ApiKeyRateThrottle,
]
def filter_queryset(self, queryset):
for backend in list(self.filter_backends):
queryset = backend().filter_queryset(self.request, queryset, self)
return queryset
def get_throttles(self):
throttle_classes = []
api_key = self.request.headers.get("X-Api-Key")
if api_key:
service_token = APIToken.objects.filter(
token=api_key,
is_service=True,
).first()
if service_token:
throttle_classes.append(ServiceTokenRateThrottle())
return throttle_classes
throttle_classes.append(ApiKeyRateThrottle())
return throttle_classes
def handle_exception(self, exc):
"""
Handle any exception that occurs, by returning an appropriate response,
@@ -152,4 +167,4 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
for expand in self.request.GET.get("expand", "").split(",")
if expand
]
return expand if expand else None
return expand if expand else None

View File

@@ -13,8 +13,12 @@ from django.db.models import (
Q,
Sum,
FloatField,
Case,
When,
Value,
)
from django.db.models.functions import Cast
from django.db.models.functions import Cast, Concat
from django.db import models
# Third party imports
from rest_framework import status
@@ -26,13 +30,13 @@ from plane.api.serializers import (
CycleSerializer,
)
from plane.app.permissions import ProjectEntityPermission
from plane.bgtasks.issue_activites_task import issue_activity
from plane.bgtasks.issue_activities_task import issue_activity
from plane.db.models import (
Cycle,
CycleIssue,
Issue,
Project,
IssueAttachment,
FileAsset,
IssueLink,
ProjectMember,
UserFavorite,
@@ -207,8 +211,7 @@ class CycleAPIEndpoint(BaseAPIView):
# Incomplete Cycles
if cycle_view == "incomplete":
queryset = queryset.filter(
Q(end_date__gte=timezone.now().date())
| Q(end_date__isnull=True),
Q(end_date__gte=timezone.now()) | Q(end_date__isnull=True),
)
return self.paginate(
request=request,
@@ -309,10 +312,7 @@ class CycleAPIEndpoint(BaseAPIView):
request_data = request.data
if (
cycle.end_date is not None
and cycle.end_date < timezone.now().date()
):
if cycle.end_date is not None and cycle.end_date < timezone.now():
if "sort_order" in request_data:
# Can only change sort order
request_data = {
@@ -405,10 +405,6 @@ class CycleAPIEndpoint(BaseAPIView):
)
# Delete the cycle
cycle.delete()
# Delete the cycle issues
CycleIssue.objects.filter(
cycle_id=self.kwargs.get("pk"),
).delete()
# Delete the user favorite cycle
UserFavorite.objects.filter(
entity_type="cycle",
@@ -537,13 +533,19 @@ class CycleArchiveUnarchiveAPIEndpoint(BaseAPIView):
cycle = Cycle.objects.get(
pk=cycle_id, project_id=project_id, workspace__slug=slug
)
if cycle.end_date >= timezone.now().date():
if cycle.end_date >= timezone.now():
return Response(
{"error": "Only completed cycles can be archived"},
status=status.HTTP_400_BAD_REQUEST,
)
cycle.archived_at = timezone.now()
cycle.save()
UserFavorite.objects.filter(
entity_type="cycle",
entity_identifier=cycle_id,
project_id=project_id,
workspace__slug=slug,
).delete()
return Response(status=status.HTTP_204_NO_CONTENT)
def delete(self, request, slug, project_id, cycle_id):
@@ -639,8 +641,9 @@ class CycleIssueAPIEndpoint(BaseAPIView):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -881,7 +884,27 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
.annotate(display_name=F("assignees__display_name"))
.annotate(assignee_id=F("assignees__id"))
.annotate(avatar=F("assignees__avatar"))
.values("display_name", "assignee_id", "avatar")
.annotate(
avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
assignees__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"assignees__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
assignees__avatar_asset__isnull=True,
then="assignees__avatar",
),
default=Value(None),
output_field=models.CharField(),
)
)
.values("display_name", "assignee_id", "avatar", "avatar_url")
.annotate(
total_estimates=Sum(
Cast("estimate_point__value", FloatField())
@@ -918,7 +941,8 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
if item["assignee_id"]
else None
),
"avatar": item["avatar"],
"avatar": item.get("avatar", None),
"avatar_url": item.get("avatar_url", None),
"total_estimates": item["total_estimates"],
"completed_estimates": item["completed_estimates"],
"pending_estimates": item["pending_estimates"],
@@ -996,7 +1020,27 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
.annotate(display_name=F("assignees__display_name"))
.annotate(assignee_id=F("assignees__id"))
.annotate(avatar=F("assignees__avatar"))
.values("display_name", "assignee_id", "avatar")
.annotate(
avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
assignees__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"assignees__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
assignees__avatar_asset__isnull=True,
then="assignees__avatar",
),
default=Value(None),
output_field=models.CharField(),
)
)
.values("display_name", "assignee_id", "avatar_url")
.annotate(
total_issues=Count(
"id",
@@ -1035,7 +1079,8 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
"assignee_id": (
str(item["assignee_id"]) if item["assignee_id"] else None
),
"avatar": item["avatar"],
"avatar": item.get("avatar", None),
"avatar_url": item.get("avatar_url", None),
"total_issues": item["total_issues"],
"completed_issues": item["completed_issues"],
"pending_issues": item["pending_issues"],
@@ -1140,7 +1185,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
if (
new_cycle.end_date is not None
and new_cycle.end_date < timezone.now().date()
and new_cycle.end_date < timezone.now()
):
return Response(
{

View File

@@ -1,7 +1,7 @@
# Python imports
import json
# Django improts
# Django imports
from django.core.serializers.json import DjangoJSONEncoder
from django.utils import timezone
from django.db.models import Q, Value, UUIDField
@@ -16,7 +16,7 @@ from rest_framework.response import Response
# Module imports
from plane.api.serializers import InboxIssueSerializer, IssueSerializer
from plane.app.permissions import ProjectLitePermission
from plane.bgtasks.issue_activites_task import issue_activity
from plane.bgtasks.issue_activities_task import issue_activity
from plane.db.models import (
Inbox,
InboxIssue,
@@ -184,13 +184,8 @@ class InboxIssueAPIEndpoint(BaseAPIView):
workspace__slug=slug, project_id=project_id
).first()
project = Project.objects.get(
workspace__slug=slug,
pk=project_id,
)
# Inbox view
if inbox is None and not project.inbox_view:
if inbox is None:
return Response(
{
"error": "Inbox is not enabled for this project enable it through the project's api"
@@ -215,7 +210,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
)
# Only project members admins and created_by users can access this endpoint
if project_member.role <= 10 and str(inbox_issue.created_by_id) != str(
if project_member.role <= 5 and str(inbox_issue.created_by_id) != str(
request.user.id
):
return Response(
@@ -232,7 +227,10 @@ class InboxIssueAPIEndpoint(BaseAPIView):
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
filter=Q(
~Q(labels__id__isnull=True)
& Q(label_issue__deleted_at__isnull=True),
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -240,7 +238,11 @@ class InboxIssueAPIEndpoint(BaseAPIView):
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True),
filter=Q(
~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True)
& Q(issue_assignee__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -249,9 +251,8 @@ class InboxIssueAPIEndpoint(BaseAPIView):
workspace__slug=slug,
project_id=project_id,
)
# Only allow guests and viewers to edit name and description
if project_member.role <= 10:
# viewers and guests since only viewers and guests
# Only allow guests to edit name and description
if project_member.role <= 5:
issue_data = {
"name": issue_data.get("name", issue.name),
"description_html": issue_data.get(
@@ -291,7 +292,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
)
# Only project admins and members can edit inbox issue attributes
if project_member.role > 10:
if project_member.role > 15:
serializer = InboxIssueSerializer(
inbox_issue, data=request.data, partial=True
)

View File

@@ -16,6 +16,7 @@ from django.db.models import (
Q,
Value,
When,
Subquery,
)
from django.utils import timezone
@@ -38,16 +39,17 @@ from plane.app.permissions import (
ProjectLitePermission,
ProjectMemberPermission,
)
from plane.bgtasks.issue_activites_task import issue_activity
from plane.bgtasks.issue_activities_task import issue_activity
from plane.db.models import (
Issue,
IssueActivity,
IssueAttachment,
FileAsset,
IssueComment,
IssueLink,
Label,
Project,
ProjectMember,
CycleIssue,
)
from .base import BaseAPIView
@@ -202,7 +204,13 @@ class IssueAPIEndpoint(BaseAPIView):
issue_queryset = (
self.get_queryset()
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(
cycle_id=Subquery(
CycleIssue.objects.filter(
issue=OuterRef("id"), deleted_at__isnull=True
).values("cycle_id")[:1]
)
)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@@ -210,8 +218,9 @@ class IssueAPIEndpoint(BaseAPIView):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -355,6 +364,124 @@ class IssueAPIEndpoint(BaseAPIView):
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def put(self, request, slug, project_id):
# Get the entities required for putting the issue, external_id and
# external_source are must to identify the issue here
project = Project.objects.get(pk=project_id)
external_id = request.data.get("external_id")
external_source = request.data.get("external_source")
# If the external_id and source are present, we need to find the exact
# issue that needs to be updated with the provided external_id and
# external_source
if external_id and external_source:
try:
issue = Issue.objects.get(
project_id=project_id,
workspace__slug=slug,
external_id=external_id,
external_source=external_source,
)
# Get the current instance of the issue in order to track
# changes and dispatch the issue activity
current_instance = json.dumps(
IssueSerializer(issue).data, cls=DjangoJSONEncoder
)
# Get the requested data, encode it as django object and pass it
# to serializer to validation
requested_data = json.dumps(
self.request.data, cls=DjangoJSONEncoder
)
serializer = IssueSerializer(
issue,
data=request.data,
context={
"project_id": project_id,
"workspace_id": project.workspace_id,
},
partial=True,
)
if serializer.is_valid():
# If the serializer is valid, save the issue and dispatch
# the update issue activity worker event.
serializer.save()
issue_activity.delay(
type="issue.activity.updated",
requested_data=requested_data,
actor_id=str(request.user.id),
issue_id=str(issue.id),
project_id=str(project_id),
current_instance=current_instance,
epoch=int(timezone.now().timestamp()),
)
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(
# If the serializer is not valid, respond with 400 bad
# request
serializer.errors,
status=status.HTTP_400_BAD_REQUEST,
)
except Issue.DoesNotExist:
# If the issue does not exist, a new record needs to be created
# for the requested data.
# Serialize the data with the context of the project and
# workspace
serializer = IssueSerializer(
data=request.data,
context={
"project_id": project_id,
"workspace_id": project.workspace_id,
"default_assignee_id": project.default_assignee_id,
},
)
# If the serializer is valid, save the issue and dispatch the
# issue activity worker event as created
if serializer.is_valid():
serializer.save()
# Refetch the issue
issue = Issue.objects.filter(
workspace__slug=slug,
project_id=project_id,
pk=serializer.data["id"],
).first()
# If any of the created_at or created_by is present, update
# the issue with the provided data, else return with the
# default states given.
issue.created_at = request.data.get(
"created_at", timezone.now()
)
issue.created_by_id = request.data.get(
"created_by", request.user.id
)
issue.save(update_fields=["created_at", "created_by"])
issue_activity.delay(
type="issue.activity.created",
requested_data=json.dumps(
self.request.data, cls=DjangoJSONEncoder
),
actor_id=str(request.user.id),
issue_id=str(serializer.data.get("id", None)),
project_id=str(project_id),
current_instance=None,
epoch=int(timezone.now().timestamp()),
)
return Response(
serializer.data, status=status.HTTP_201_CREATED
)
return Response(
serializer.errors, status=status.HTTP_400_BAD_REQUEST
)
else:
return Response(
{"error": "external_id and external_source are required"},
status=status.HTTP_400_BAD_REQUEST,
)
def patch(self, request, slug, project_id, pk=None):
issue = Issue.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
@@ -944,7 +1071,7 @@ class IssueAttachmentEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
model = IssueAttachment
model = FileAsset
parser_classes = (MultiPartParser, FormParser)
def post(self, request, slug, project_id, issue_id):
@@ -952,7 +1079,7 @@ class IssueAttachmentEndpoint(BaseAPIView):
if (
request.data.get("external_id")
and request.data.get("external_source")
and IssueAttachment.objects.filter(
and FileAsset.objects.filter(
project_id=project_id,
workspace__slug=slug,
issue_id=issue_id,
@@ -960,7 +1087,7 @@ class IssueAttachmentEndpoint(BaseAPIView):
external_id=request.data.get("external_id"),
).exists()
):
issue_attachment = IssueAttachment.objects.filter(
issue_attachment = FileAsset.objects.filter(
workspace__slug=slug,
project_id=project_id,
external_id=request.data.get("external_id"),
@@ -994,7 +1121,7 @@ class IssueAttachmentEndpoint(BaseAPIView):
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def delete(self, request, slug, project_id, issue_id, pk):
issue_attachment = IssueAttachment.objects.get(pk=pk)
issue_attachment = FileAsset.objects.get(pk=pk)
issue_attachment.asset.delete(save=False)
issue_attachment.delete()
issue_activity.delay(
@@ -1012,7 +1139,7 @@ class IssueAttachmentEndpoint(BaseAPIView):
return Response(status=status.HTTP_204_NO_CONTENT)
def get(self, request, slug, project_id, issue_id):
issue_attachments = IssueAttachment.objects.filter(
issue_attachments = FileAsset.objects.filter(
issue_id=issue_id, workspace__slug=slug, project_id=project_id
)
serializer = IssueAttachmentSerializer(issue_attachments, many=True)

View File

@@ -133,7 +133,7 @@ class ProjectMemberAPIEndpoint(BaseAPIView):
workspace_member = WorkspaceMember.objects.create(
workspace=workspace,
member=user,
role=request.data.get("role", 10),
role=request.data.get("role", 5),
)
workspace_member.save()
@@ -142,7 +142,7 @@ class ProjectMemberAPIEndpoint(BaseAPIView):
project_member = ProjectMember.objects.create(
project=project,
member=user,
role=request.data.get("role", 10),
role=request.data.get("role", 5),
)
project_member.save()

View File

@@ -18,10 +18,10 @@ from plane.api.serializers import (
ModuleSerializer,
)
from plane.app.permissions import ProjectEntityPermission
from plane.bgtasks.issue_activites_task import issue_activity
from plane.bgtasks.issue_activities_task import issue_activity
from plane.db.models import (
Issue,
IssueAttachment,
FileAsset,
IssueLink,
Module,
ModuleIssue,
@@ -298,7 +298,11 @@ class ModuleAPIEndpoint(BaseAPIView):
actor_id=str(request.user.id),
issue_id=None,
project_id=str(project_id),
current_instance=None,
current_instance=json.dumps(
{
"module_name": str(module.name),
}
),
epoch=int(timezone.now().timestamp()),
)
module.delete()
@@ -389,8 +393,9 @@ class ModuleIssueAPIEndpoint(BaseAPIView):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -520,7 +525,6 @@ class ModuleIssueAPIEndpoint(BaseAPIView):
class ModuleArchiveUnarchiveAPIEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
@@ -635,6 +639,12 @@ class ModuleArchiveUnarchiveAPIEndpoint(BaseAPIView):
)
module.archived_at = timezone.now()
module.save()
UserFavorite.objects.filter(
entity_type="module",
entity_identifier=pk,
project_id=project_id,
workspace__slug=slug,
).delete()
return Response(status=status.HTTP_204_NO_CONTENT)
def delete(self, request, slug, project_id, pk):

View File

@@ -301,11 +301,16 @@ class ProjectAPIEndpoint(BaseAPIView):
if serializer.is_valid():
serializer.save()
if serializer.data["inbox_view"]:
Inbox.objects.get_or_create(
name=f"{project.name} Inbox",
inbox = Inbox.objects.filter(
project=project,
is_default=True,
)
).first()
if not inbox:
Inbox.objects.create(
name=f"{project.name} Inbox",
project=project,
is_default=True,
)
# Create the triage state in Backlog group
State.objects.get_or_create(
@@ -377,6 +382,10 @@ class ProjectArchiveUnarchiveAPIEndpoint(BaseAPIView):
project = Project.objects.get(pk=project_id, workspace__slug=slug)
project.archived_at = timezone.now()
project.save()
UserFavorite.objects.filter(
workspace__slug=slug,
project=project_id,
).delete()
return Response(status=status.HTTP_204_NO_CONTENT)
def delete(self, request, slug, project_id):

View File

@@ -12,3 +12,4 @@ from .project import (
ProjectMemberPermission,
ProjectLitePermission,
)
from .base import allow_permission, ROLE

View File

@@ -0,0 +1,60 @@
from plane.db.models import WorkspaceMember, ProjectMember
from functools import wraps
from rest_framework.response import Response
from rest_framework import status
from enum import Enum
class ROLE(Enum):
ADMIN = 20
MEMBER = 15
GUEST = 5
def allow_permission(allowed_roles, level="PROJECT", creator=False, model=None):
def decorator(view_func):
@wraps(view_func)
def _wrapped_view(instance, request, *args, **kwargs):
# Check for creator if required
if creator and model:
obj = model.objects.filter(
id=kwargs["pk"], created_by=request.user
).exists()
if obj:
return view_func(instance, request, *args, **kwargs)
# Convert allowed_roles to their values if they are enum members
allowed_role_values = [
role.value if isinstance(role, ROLE) else role
for role in allowed_roles
]
# Check role permissions
if level == "WORKSPACE":
if WorkspaceMember.objects.filter(
member=request.user,
workspace__slug=kwargs["slug"],
role__in=allowed_role_values,
is_active=True,
).exists():
return view_func(instance, request, *args, **kwargs)
else:
if ProjectMember.objects.filter(
member=request.user,
workspace__slug=kwargs["slug"],
project_id=kwargs["project_id"],
role__in=allowed_role_values,
is_active=True,
).exists():
return view_func(instance, request, *args, **kwargs)
# Return permission denied if no conditions are met
return Response(
{"error": "You don't have the required permissions."},
status=status.HTTP_403_FORBIDDEN,
)
return _wrapped_view
return decorator
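A minimal sketch of how this decorator is meant to be applied to a view method; the view class and route kwargs below are illustrative, not part of this changeset:

from rest_framework.response import Response

from plane.app.permissions import allow_permission, ROLE
from plane.app.views.base import BaseAPIView


class ExampleProjectEndpoint(BaseAPIView):
    # Project-level check: the requester must be an active ADMIN or MEMBER
    # of the project identified by the slug/project_id URL kwargs.
    @allow_permission([ROLE.ADMIN, ROLE.MEMBER])
    def get(self, request, slug, project_id):
        return Response({"ok": True})

    # Workspace-level check: GUESTs are allowed as well.
    @allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST], level="WORKSPACE")
    def post(self, request, slug, project_id):
        return Response({"ok": True})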

View File

@@ -7,7 +7,6 @@ from plane.db.models import ProjectMember, WorkspaceMember
# Permission Mappings
Admin = 20
Member = 15
Viewer = 10
Guest = 5

View File

@@ -6,9 +6,8 @@ from plane.db.models import WorkspaceMember
# Permission Mappings
Owner = 20
Admin = 15
Member = 10
Admin = 20
Member = 15
Guest = 5
@@ -31,7 +30,7 @@ class WorkSpaceBasePermission(BasePermission):
return WorkspaceMember.objects.filter(
member=request.user,
workspace__slug=view.workspace_slug,
role__in=[Owner, Admin],
role__in=[Admin, Member],
is_active=True,
).exists()
@@ -40,7 +39,7 @@ class WorkSpaceBasePermission(BasePermission):
return WorkspaceMember.objects.filter(
member=request.user,
workspace__slug=view.workspace_slug,
role=Owner,
role=Admin,
is_active=True,
).exists()
@@ -53,7 +52,7 @@ class WorkspaceOwnerPermission(BasePermission):
return WorkspaceMember.objects.filter(
workspace__slug=view.workspace_slug,
member=request.user,
role=Owner,
role=Admin,
).exists()
@@ -65,7 +64,7 @@ class WorkSpaceAdminPermission(BasePermission):
return WorkspaceMember.objects.filter(
member=request.user,
workspace__slug=view.workspace_slug,
role__in=[Owner, Admin],
role__in=[Admin, Member],
is_active=True,
).exists()
@@ -86,7 +85,7 @@ class WorkspaceEntityPermission(BasePermission):
return WorkspaceMember.objects.filter(
member=request.user,
workspace__slug=view.workspace_slug,
role__in=[Owner, Admin],
role__in=[Admin, Member],
is_active=True,
).exists()

View File

@@ -92,6 +92,7 @@ from .page import (
SubPageSerializer,
PageDetailSerializer,
PageVersionSerializer,
PageVersionDetailSerializer,
)
from .estimate import (
@@ -123,3 +124,9 @@ from .webhook import WebhookSerializer, WebhookLogSerializer
from .dashboard import DashboardSerializer, WidgetSerializer
from .favorite import UserFavoriteSerializer
from .draft import (
DraftIssueCreateSerializer,
DraftIssueSerializer,
DraftIssueDetailSerializer,
)

View File

@@ -49,48 +49,46 @@ class DynamicBaseSerializer(BaseSerializer):
allowed.append(list(item.keys())[0])
for field in allowed:
if field not in self.fields:
from . import (
WorkspaceLiteSerializer,
ProjectLiteSerializer,
UserLiteSerializer,
StateLiteSerializer,
IssueSerializer,
LabelSerializer,
CycleIssueSerializer,
IssueLiteSerializer,
IssueRelationSerializer,
InboxIssueLiteSerializer,
IssueReactionLiteSerializer,
IssueAttachmentLiteSerializer,
IssueLinkLiteSerializer,
)
from . import (
WorkspaceLiteSerializer,
ProjectLiteSerializer,
UserLiteSerializer,
StateLiteSerializer,
IssueSerializer,
LabelSerializer,
CycleIssueSerializer,
IssueLiteSerializer,
IssueRelationSerializer,
InboxIssueLiteSerializer,
IssueReactionLiteSerializer,
IssueLinkLiteSerializer,
)
# Expansion mapper
expansion = {
"user": UserLiteSerializer,
"workspace": WorkspaceLiteSerializer,
"project": ProjectLiteSerializer,
"default_assignee": UserLiteSerializer,
"project_lead": UserLiteSerializer,
"state": StateLiteSerializer,
"created_by": UserLiteSerializer,
"issue": IssueSerializer,
"actor": UserLiteSerializer,
"owned_by": UserLiteSerializer,
"members": UserLiteSerializer,
"assignees": UserLiteSerializer,
"labels": LabelSerializer,
"issue_cycle": CycleIssueSerializer,
"parent": IssueLiteSerializer,
"issue_relation": IssueRelationSerializer,
"issue_inbox": InboxIssueLiteSerializer,
"issue_reactions": IssueReactionLiteSerializer,
"issue_attachment": IssueAttachmentLiteSerializer,
"issue_link": IssueLinkLiteSerializer,
"sub_issues": IssueLiteSerializer,
}
# Expansion mapper
expansion = {
"user": UserLiteSerializer,
"workspace": WorkspaceLiteSerializer,
"project": ProjectLiteSerializer,
"default_assignee": UserLiteSerializer,
"project_lead": UserLiteSerializer,
"state": StateLiteSerializer,
"created_by": UserLiteSerializer,
"issue": IssueSerializer,
"actor": UserLiteSerializer,
"owned_by": UserLiteSerializer,
"members": UserLiteSerializer,
"assignees": UserLiteSerializer,
"labels": LabelSerializer,
"issue_cycle": CycleIssueSerializer,
"parent": IssueLiteSerializer,
"issue_relation": IssueRelationSerializer,
"issue_inbox": InboxIssueLiteSerializer,
"issue_reactions": IssueReactionLiteSerializer,
"issue_link": IssueLinkLiteSerializer,
"sub_issues": IssueLiteSerializer,
}
if field not in self.fields and field in expansion:
self.fields[field] = expansion[field](
many=(
True
@@ -178,4 +176,29 @@ class DynamicBaseSerializer(BaseSerializer):
instance, f"{expand}_id", None
)
# Check if issue_attachments is in fields or expand
if (
"issue_attachments" in self.fields
or "issue_attachments" in self.expand
):
# Import the model here to avoid circular imports
from plane.db.models import FileAsset
issue_id = getattr(instance, "id", None)
if issue_id:
# Fetch related issue_attachments
issue_attachments = FileAsset.objects.filter(
issue_id=issue_id,
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
# Serialize issue_attachments and add them to the response
response["issue_attachments"] = (
IssueAttachmentLiteSerializer(
issue_attachments, many=True
).data
)
else:
response["issue_attachments"] = []
return response
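As a hedged illustration, assuming IssueSerializer inherits this behaviour and that the serializer's __init__ (not shown in this diff) accepts fields/expand keyword arguments, as the references to self.fields and self.expand suggest:

# Hypothetical usage; the issue instance and kwargs are placeholders.
data = IssueSerializer(issue, expand=["issue_attachments"]).data
# data["issue_attachments"] is now a list serialized with
# IssueAttachmentLiteSerializer, built from FileAsset rows filtered on
# entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT.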

View File

@@ -0,0 +1,292 @@
# Django imports
from django.utils import timezone
# Third Party imports
from rest_framework import serializers
# Module imports
from .base import BaseSerializer
from plane.db.models import (
User,
Issue,
Label,
State,
DraftIssue,
DraftIssueAssignee,
DraftIssueLabel,
DraftIssueCycle,
DraftIssueModule,
)
class DraftIssueCreateSerializer(BaseSerializer):
# ids
state_id = serializers.PrimaryKeyRelatedField(
source="state",
queryset=State.objects.all(),
required=False,
allow_null=True,
)
parent_id = serializers.PrimaryKeyRelatedField(
source="parent",
queryset=Issue.objects.all(),
required=False,
allow_null=True,
)
label_ids = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(queryset=Label.objects.all()),
write_only=True,
required=False,
)
assignee_ids = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(queryset=User.objects.all()),
write_only=True,
required=False,
)
class Meta:
model = DraftIssue
fields = "__all__"
read_only_fields = [
"workspace",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
def to_representation(self, instance):
data = super().to_representation(instance)
assignee_ids = self.initial_data.get("assignee_ids")
data["assignee_ids"] = assignee_ids if assignee_ids else []
label_ids = self.initial_data.get("label_ids")
data["label_ids"] = label_ids if label_ids else []
return data
def validate(self, data):
if (
data.get("start_date", None) is not None
and data.get("target_date", None) is not None
and data.get("start_date", None) > data.get("target_date", None)
):
raise serializers.ValidationError(
"Start date cannot exceed target date"
)
return data
def create(self, validated_data):
assignees = validated_data.pop("assignee_ids", None)
labels = validated_data.pop("label_ids", None)
modules = validated_data.pop("module_ids", None)
cycle_id = self.initial_data.get("cycle_id", None)
modules = self.initial_data.get("module_ids", None)
workspace_id = self.context["workspace_id"]
project_id = self.context["project_id"]
# Create Issue
issue = DraftIssue.objects.create(
**validated_data,
workspace_id=workspace_id,
project_id=project_id,
)
# Issue Audit Users
created_by_id = issue.created_by_id
updated_by_id = issue.updated_by_id
if assignees is not None and len(assignees):
DraftIssueAssignee.objects.bulk_create(
[
DraftIssueAssignee(
assignee=user,
draft_issue=issue,
workspace_id=workspace_id,
project_id=project_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
for user in assignees
],
batch_size=10,
)
if labels is not None and len(labels):
DraftIssueLabel.objects.bulk_create(
[
DraftIssueLabel(
label=label,
draft_issue=issue,
project_id=project_id,
workspace_id=workspace_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
for label in labels
],
batch_size=10,
)
if cycle_id is not None:
DraftIssueCycle.objects.create(
cycle_id=cycle_id,
draft_issue=issue,
project_id=project_id,
workspace_id=workspace_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
if modules is not None and len(modules):
DraftIssueModule.objects.bulk_create(
[
DraftIssueModule(
module_id=module_id,
draft_issue=issue,
project_id=project_id,
workspace_id=workspace_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
for module_id in modules
],
batch_size=10,
)
return issue
def update(self, instance, validated_data):
assignees = validated_data.pop("assignee_ids", None)
labels = validated_data.pop("label_ids", None)
cycle_id = self.context.get("cycle_id", None)
modules = self.initial_data.get("module_ids", None)
# Related models
workspace_id = instance.workspace_id
project_id = instance.project_id
created_by_id = instance.created_by_id
updated_by_id = instance.updated_by_id
if assignees is not None:
DraftIssueAssignee.objects.filter(draft_issue=instance).delete()
DraftIssueAssignee.objects.bulk_create(
[
DraftIssueAssignee(
assignee=user,
draft_issue=instance,
workspace_id=workspace_id,
project_id=project_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
for user in assignees
],
batch_size=10,
)
if labels is not None:
DraftIssueLabel.objects.filter(draft_issue=instance).delete()
DraftIssueLabel.objects.bulk_create(
[
DraftIssueLabel(
label=label,
draft_issue=instance,
workspace_id=workspace_id,
project_id=project_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
for label in labels
],
batch_size=10,
)
if cycle_id != "not_provided":
DraftIssueCycle.objects.filter(draft_issue=instance).delete()
if cycle_id:
DraftIssueCycle.objects.create(
cycle_id=cycle_id,
draft_issue=instance,
workspace_id=workspace_id,
project_id=project_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
if modules is not None:
DraftIssueModule.objects.filter(draft_issue=instance).delete()
DraftIssueModule.objects.bulk_create(
[
DraftIssueModule(
module_id=module_id,
draft_issue=instance,
workspace_id=workspace_id,
project_id=project_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
for module_id in modules
],
batch_size=10,
)
# Touch the updated_at timestamp even when only related models change
instance.updated_at = timezone.now()
return super().update(instance, validated_data)
class DraftIssueSerializer(BaseSerializer):
# ids
cycle_id = serializers.PrimaryKeyRelatedField(read_only=True)
module_ids = serializers.ListField(
child=serializers.UUIDField(),
required=False,
)
# Many to many
label_ids = serializers.ListField(
child=serializers.UUIDField(),
required=False,
)
assignee_ids = serializers.ListField(
child=serializers.UUIDField(),
required=False,
)
class Meta:
model = DraftIssue
fields = [
"id",
"name",
"state_id",
"sort_order",
"completed_at",
"estimate_point",
"priority",
"start_date",
"target_date",
"project_id",
"parent_id",
"cycle_id",
"module_ids",
"label_ids",
"assignee_ids",
"created_at",
"updated_at",
"created_by",
"updated_by",
"type_id",
"description_html",
]
read_only_fields = fields
class DraftIssueDetailSerializer(DraftIssueSerializer):
description_html = serializers.CharField()
class Meta(DraftIssueSerializer.Meta):
fields = DraftIssueSerializer.Meta.fields + [
"description_html",
]
read_only_fields = fields
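A sketch of creating a draft through this serializer; every ID, field value, and the context below are placeholders, since in the actual views the context comes from the URL kwargs:

# Hypothetical invocation of DraftIssueCreateSerializer.
serializer = DraftIssueCreateSerializer(
    data={
        "name": "Spike: evaluate new editor",
        "assignee_ids": ["<user-uuid>"],
        "label_ids": ["<label-uuid>"],
        "cycle_id": "<cycle-uuid>",        # read from initial_data in create()
        "module_ids": ["<module-uuid>"],   # read from initial_data in create()
    },
    context={"workspace_id": "<workspace-uuid>", "project_id": "<project-uuid>"},
)
if serializer.is_valid(raise_exception=True):
    # create() also writes the DraftIssueAssignee/Label/Cycle/Module rows.
    draft = serializer.save()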

View File

@@ -27,7 +27,7 @@ from plane.db.models import (
Module,
ModuleIssue,
IssueLink,
IssueAttachment,
FileAsset,
IssueReaction,
CommentReaction,
IssueVote,
@@ -437,17 +437,21 @@ class IssueLinkSerializer(BaseSerializer):
"issue",
]
def validate_url(self, value):
# Check URL format
validate_url = URLValidator()
try:
validate_url(value)
except ValidationError:
raise serializers.ValidationError("Invalid URL format.")
def to_internal_value(self, data):
# Modify the URL before validation by appending http:// if missing
url = data.get("url", "")
if url and not url.startswith(("http://", "https://")):
data["url"] = "http://" + url
# Check URL scheme
if not value.startswith(("http://", "https://")):
raise serializers.ValidationError("Invalid URL scheme.")
return super().to_internal_value(data)
def validate_url(self, value):
# Use Django's built-in URLValidator for validation
url_validator = URLValidator()
try:
url_validator(value)
except ValidationError:
raise serializers.ValidationError({"error": "Invalid URL format."})
return value
@@ -494,8 +498,11 @@ class IssueLinkLiteSerializer(BaseSerializer):
class IssueAttachmentSerializer(BaseSerializer):
asset_url = serializers.CharField(read_only=True)
class Meta:
model = IssueAttachment
model = FileAsset
fields = "__all__"
read_only_fields = [
"created_by",
@@ -510,14 +517,15 @@ class IssueAttachmentSerializer(BaseSerializer):
class IssueAttachmentLiteSerializer(DynamicBaseSerializer):
class Meta:
model = IssueAttachment
model = FileAsset
fields = [
"id",
"asset",
"attributes",
"issue_id",
# "issue_id",
"updated_at",
"updated_by",
"asset_url",
]
read_only_fields = fields
@@ -533,7 +541,7 @@ class IssueReactionSerializer(BaseSerializer):
"project",
"issue",
"actor",
"deleted_at"
"deleted_at",
]
@@ -552,7 +560,13 @@ class CommentReactionSerializer(BaseSerializer):
class Meta:
model = CommentReaction
fields = "__all__"
read_only_fields = ["workspace", "project", "comment", "actor", "deleted_at"]
read_only_fields = [
"workspace",
"project",
"comment",
"actor",
"deleted_at",
]
class IssueVoteSerializer(BaseSerializer):

View File

@@ -5,6 +5,10 @@ from rest_framework import serializers
from .base import BaseSerializer, DynamicBaseSerializer
from .project import ProjectLiteSerializer
# Django imports
from django.core.validators import URLValidator
from django.core.exceptions import ValidationError
from plane.db.models import (
User,
Module,
@@ -64,6 +68,16 @@ class ModuleWriteSerializer(BaseSerializer):
members = validated_data.pop("member_ids", None)
project = self.context["project"]
module_name = validated_data.get("name")
if module_name:
# Look up the module name within the project to prevent duplicates
if Module.objects.filter(
name=module_name, project=project
).exists():
raise serializers.ValidationError(
{"error": "Module with this name already exists"}
)
module = Module.objects.create(**validated_data, project=project)
if members is not None:
ModuleMember.objects.bulk_create(
@@ -86,6 +100,19 @@ class ModuleWriteSerializer(BaseSerializer):
def update(self, instance, validated_data):
members = validated_data.pop("member_ids", None)
module_name = validated_data.get("name")
if module_name:
# Look up the module name within the project to prevent duplicates
if (
Module.objects.filter(
name=module_name, project=instance.project
)
.exclude(id=instance.id)
.exists()
):
raise serializers.ValidationError(
{"error": "Module with this name already exists"}
)
if members is not None:
ModuleMember.objects.filter(module=instance).delete()
@@ -155,16 +182,48 @@ class ModuleLinkSerializer(BaseSerializer):
"module",
]
# Normalize and validate the URL; duplicate checks happen in create/update
def to_internal_value(self, data):
# Modify the URL before validation by appending http:// if missing
url = data.get("url", "")
if url and not url.startswith(("http://", "https://")):
data["url"] = "http://" + url
return super().to_internal_value(data)
def validate_url(self, value):
# Use Django's built-in URLValidator for validation
url_validator = URLValidator()
try:
url_validator(value)
except ValidationError:
raise serializers.ValidationError({"error": "Invalid URL format."})
return value
def create(self, validated_data):
validated_data["url"] = self.validate_url(validated_data.get("url"))
if ModuleLink.objects.filter(
url=validated_data.get("url"),
module_id=validated_data.get("module_id"),
).exists():
raise serializers.ValidationError({"error": "URL already exists."})
return super().create(validated_data)
def update(self, instance, validated_data):
validated_data["url"] = self.validate_url(validated_data.get("url"))
if (
ModuleLink.objects.filter(
url=validated_data.get("url"),
module_id=instance.module_id,
)
.exclude(pk=instance.id)
.exists()
):
raise serializers.ValidationError(
{"error": "URL already exists for this Issue"}
)
return ModuleLink.objects.create(**validated_data)
return super().update(instance, validated_data)
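To make the new link handling concrete, a standalone sketch of the same normalize-then-validate flow; the function name is ours, not the serializer's:

from django.core.exceptions import ValidationError
from django.core.validators import URLValidator


def normalize_and_validate(url: str) -> str:
    # Prepend a scheme when missing, mirroring to_internal_value above,
    # then let Django's URLValidator reject malformed values.
    if url and not url.startswith(("http://", "https://")):
        url = "http://" + url
    try:
        URLValidator()(url)
    except ValidationError:
        raise ValueError("Invalid URL format.")
    return url

# normalize_and_validate("plane.so/changelog") -> "http://plane.so/changelog"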
class ModuleSerializer(DynamicBaseSerializer):
@@ -229,7 +288,14 @@ class ModuleDetailSerializer(ModuleSerializer):
cancelled_estimate_points = serializers.FloatField(read_only=True)
class Meta(ModuleSerializer.Meta):
fields = ModuleSerializer.Meta.fields + ["link_module", "sub_issues", "backlog_estimate_points", "unstarted_estimate_points", "started_estimate_points", "cancelled_estimate_points"]
fields = ModuleSerializer.Meta.fields + [
"link_module",
"sub_issues",
"backlog_estimate_points",
"unstarted_estimate_points",
"started_estimate_points",
"cancelled_estimate_points",
]
class ModuleUserPropertiesSerializer(BaseSerializer):

View File

@@ -167,7 +167,40 @@ class PageLogSerializer(BaseSerializer):
class PageVersionSerializer(BaseSerializer):
class Meta:
model = PageVersion
fields = "__all__"
fields = [
"id",
"workspace",
"page",
"last_saved_at",
"owned_by",
"created_at",
"updated_at",
"created_by",
"updated_by",
]
read_only_fields = [
"workspace",
"page",
]
class PageVersionDetailSerializer(BaseSerializer):
class Meta:
model = PageVersion
fields = [
"id",
"workspace",
"page",
"last_saved_at",
"description_binary",
"description_html",
"description_json",
"owned_by",
"created_at",
"updated_at",
"created_by",
"updated_by",
]
read_only_fields = [
"workspace",
"page",

View File

@@ -95,6 +95,7 @@ class ProjectLiteSerializer(BaseSerializer):
"identifier",
"name",
"cover_image",
"cover_image_url",
"logo_props",
"description",
]
@@ -117,6 +118,7 @@ class ProjectListSerializer(DynamicBaseSerializer):
member_role = serializers.IntegerField(read_only=True)
anchor = serializers.CharField(read_only=True)
members = serializers.SerializerMethodField()
cover_image_url = serializers.CharField(read_only=True)
def get_members(self, obj):
project_members = getattr(obj, "members_list", None)
@@ -128,6 +130,7 @@ class ProjectListSerializer(DynamicBaseSerializer):
"member_id": member.member_id,
"member__display_name": member.member.display_name,
"member__avatar": member.member.avatar,
"member__avatar_url": member.member.avatar_url,
}
for member in project_members
]

View File

@@ -16,26 +16,39 @@ from .base import BaseSerializer
class UserSerializer(BaseSerializer):
class Meta:
model = User
fields = "__all__"
# Exclude password field from the serializer
fields = [
field.name
for field in User._meta.fields
if field.name != "password"
]
# Make all system fields and email read only
read_only_fields = [
"id",
"username",
"mobile_number",
"email",
"token",
"created_at",
"updated_at",
"is_superuser",
"is_staff",
"is_managed",
"last_active",
"last_login_time",
"last_logout_time",
"last_login_ip",
"last_logout_ip",
"last_login_uagent",
"token_updated_at",
"last_location",
"last_login_medium",
"created_location",
"is_bot",
"is_password_autoset",
"is_email_verified",
"is_active",
"token_updated_at",
]
extra_kwargs = {"password": {"write_only": True}}
# The user is considered onboarded once they have filled in a first or last name
def get_is_onboarded(self, obj):
@@ -43,12 +56,15 @@ class UserSerializer(BaseSerializer):
class UserMeSerializer(BaseSerializer):
class Meta:
model = User
fields = [
"id",
"avatar",
"cover_image",
"avatar_url",
"cover_image_url",
"date_joined",
"display_name",
"email",
@@ -143,6 +159,7 @@ class UserLiteSerializer(BaseSerializer):
"first_name",
"last_name",
"avatar",
"avatar_url",
"is_bot",
"display_name",
]
@@ -160,9 +177,11 @@ class UserAdminLiteSerializer(BaseSerializer):
"first_name",
"last_name",
"avatar",
"avatar_url",
"is_bot",
"display_name",
"email",
"last_login_medium",
]
read_only_fields = [
"id",
@@ -208,9 +227,15 @@ class ProfileSerializer(BaseSerializer):
class Meta:
model = Profile
fields = "__all__"
read_only_fields = [
"user",
]
class AccountSerializer(BaseSerializer):
class Meta:
model = Account
fields = "__all__"
read_only_fields = [
"user",
]

View File

@@ -40,7 +40,7 @@ class WebhookSerializer(DynamicBaseSerializer):
for addr in ip_addresses:
ip = ipaddress.ip_address(addr[4][0])
if ip.is_private or ip.is_loopback:
if ip.is_loopback:
raise serializers.ValidationError(
{"url": "URL resolves to a blocked IP address."}
)
@@ -92,7 +92,7 @@ class WebhookSerializer(DynamicBaseSerializer):
for addr in ip_addresses:
ip = ipaddress.ip_address(addr[4][0])
if ip.is_private or ip.is_loopback:
if ip.is_loopback:
raise serializers.ValidationError(
{"url": "URL resolves to a blocked IP address."}
)
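The effect of this relaxation, in isolation: only loopback addresses are rejected now, so webhook targets on private ranges (common for self-hosted setups) pass. A small sketch with a hypothetical hostname:

import ipaddress
import socket

hostname = "hooks.internal.example"  # hypothetical webhook host
for addr in socket.getaddrinfo(hostname, 443):
    ip = ipaddress.ip_address(addr[4][0])
    if ip.is_loopback:  # previously: ip.is_private or ip.is_loopback
        raise ValueError("URL resolves to a blocked IP address.")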

View File

@@ -22,6 +22,7 @@ class WorkSpaceSerializer(DynamicBaseSerializer):
owner = UserLiteSerializer(read_only=True)
total_members = serializers.IntegerField(read_only=True)
total_issues = serializers.IntegerField(read_only=True)
logo_url = serializers.CharField(read_only=True)
def validate_slug(self, value):
# Check if the slug is restricted
@@ -39,6 +40,7 @@ class WorkSpaceSerializer(DynamicBaseSerializer):
"created_at",
"updated_at",
"owner",
"logo_url",
]
@@ -63,6 +65,7 @@ class WorkSpaceMemberSerializer(DynamicBaseSerializer):
class WorkspaceMemberMeSerializer(BaseSerializer):
draft_issue_count = serializers.IntegerField(read_only=True)
class Meta:
model = WorkspaceMember
fields = "__all__"

View File

@@ -5,6 +5,13 @@ from plane.app.views import (
FileAssetEndpoint,
UserAssetsEndpoint,
FileAssetViewSet,
# V2 Endpoints
WorkspaceFileAssetEndpoint,
UserAssetsV2Endpoint,
StaticFileAssetEndpoint,
AssetRestoreEndpoint,
ProjectAssetEndpoint,
ProjectBulkAssetEndpoint,
)
@@ -38,4 +45,49 @@ urlpatterns = [
),
name="file-assets-restore",
),
# V2 Endpoints
path(
"assets/v2/workspaces/<str:slug>/",
WorkspaceFileAssetEndpoint.as_view(),
name="workspace-file-assets",
),
path(
"assets/v2/workspaces/<str:slug>/<uuid:asset_id>/",
WorkspaceFileAssetEndpoint.as_view(),
name="workspace-file-assets",
),
path(
"assets/v2/user-assets/",
UserAssetsV2Endpoint.as_view(),
name="user-file-assets",
),
path(
"assets/v2/user-assets/<uuid:asset_id>/",
UserAssetsV2Endpoint.as_view(),
name="user-file-assets",
),
path(
"assets/v2/workspaces/<str:slug>/restore/<uuid:asset_id>/",
AssetRestoreEndpoint.as_view(),
name="asset-restore",
),
path(
"assets/v2/static/<uuid:asset_id>/",
StaticFileAssetEndpoint.as_view(),
name="static-file-asset",
),
path(
"assets/v2/workspaces/<str:slug>/projects/<uuid:project_id>/",
ProjectAssetEndpoint.as_view(),
name="bulk-asset-update",
),
path(
"assets/v2/workspaces/<str:slug>/projects/<uuid:project_id>/<uuid:pk>/",
ProjectAssetEndpoint.as_view(),
name="bulk-asset-update",
),
path(
"assets/v2/workspaces/<str:slug>/projects/<uuid:project_id>/<uuid:entity_id>/bulk/",
ProjectBulkAssetEndpoint.as_view(),
),
]

View File

@@ -6,6 +6,8 @@ from plane.app.views import (
CycleIssueViewSet,
CycleDateCheckEndpoint,
CycleFavoriteViewSet,
CycleProgressEndpoint,
CycleAnalyticsEndpoint,
TransferCycleIssueEndpoint,
CycleUserPropertiesEndpoint,
CycleArchiveUnarchiveEndpoint,
@@ -106,4 +108,14 @@ urlpatterns = [
CycleArchiveUnarchiveEndpoint.as_view(),
name="cycle-archive-unarchive",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/progress/",
CycleProgressEndpoint.as_view(),
name="project-cycle",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/analytics/",
CycleAnalyticsEndpoint.as_view(),
name="project-cycle",
),
]

View File

@@ -40,7 +40,7 @@ urlpatterns = [
name="inbox-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inbox-issues/<uuid:issue_id>/",
"workspaces/<str:slug>/projects/<uuid:project_id>/inbox-issues/<uuid:pk>/",
InboxIssueViewSet.as_view(
{
"get": "retrieve",

View File

@@ -11,7 +11,6 @@ from plane.app.views import (
IssueActivityEndpoint,
IssueArchiveViewSet,
IssueCommentViewSet,
IssueDraftViewSet,
IssueListEndpoint,
IssueReactionViewSet,
IssueRelationViewSet,
@@ -19,8 +18,10 @@ from plane.app.views import (
IssueUserDisplayPropertyEndpoint,
IssueViewSet,
LabelViewSet,
BulkIssueOperationsEndpoint,
BulkArchiveIssuesEndpoint,
DeletedIssuesListViewSet,
IssuePaginatedViewSet,
IssueAttachmentV2Endpoint,
)
urlpatterns = [
@@ -39,6 +40,12 @@ urlpatterns = [
),
name="project-issue",
),
# updated v2 paginated issues
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/v2/issues/",
IssuePaginatedViewSet.as_view({"get": "list"}),
name="project-issues-paginated",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:pk>/",
IssueViewSet.as_view(
@@ -126,6 +133,18 @@ urlpatterns = [
IssueAttachmentEndpoint.as_view(),
name="project-issue-attachments",
),
# V2 Attachments
path(
"assets/v2/workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/attachments/",
IssueAttachmentV2Endpoint.as_view(),
name="project-issue-attachments",
),
path(
"assets/v2/workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/attachments/<uuid:pk>/",
IssueAttachmentV2Endpoint.as_view(),
name="project-issue-attachments",
),
## Export Issues
path(
"workspaces/<str:slug>/export-issues/",
ExportIssuesEndpoint.as_view(),
@@ -283,31 +302,9 @@ urlpatterns = [
name="issue-relation",
),
## End Issue Relation
## Issue Drafts
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issue-drafts/",
IssueDraftViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="project-issue-draft",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issue-drafts/<uuid:pk>/",
IssueDraftViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="project-issue-draft",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/bulk-operation-issues/",
BulkIssueOperationsEndpoint.as_view(),
name="bulk-operations-issues",
"workspaces/<str:slug>/projects/<uuid:project_id>/deleted-issues/",
DeletedIssuesListViewSet.as_view(),
name="deleted-issues",
),
]

View File

@@ -27,6 +27,7 @@ from plane.app.views import (
WorkspaceCyclesEndpoint,
WorkspaceFavoriteEndpoint,
WorkspaceFavoriteGroupEndpoint,
WorkspaceDraftIssueViewSet,
)
@@ -254,4 +255,30 @@ urlpatterns = [
WorkspaceFavoriteGroupEndpoint.as_view(),
name="workspace-user-favorites-groups",
),
path(
"workspaces/<str:slug>/draft-issues/",
WorkspaceDraftIssueViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="workspace-draft-issues",
),
path(
"workspaces/<str:slug>/draft-issues/<uuid:pk>/",
WorkspaceDraftIssueViewSet.as_view(
{
"get": "retrieve",
"patch": "partial_update",
"delete": "destroy",
}
),
name="workspace-drafts-issues",
),
path(
"workspaces/<str:slug>/draft-to-issue/<uuid:draft_id>/",
WorkspaceDraftIssueViewSet.as_view({"post": "create_draft_to_issue"}),
name="workspace-drafts-issues",
),
]
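A hypothetical client call for the new draft-to-issue promotion route; the host, "/api" prefix, and auth header are deployment-specific placeholders, not confirmed by this changeset:

import requests

requests.post(
    "https://app.example.com/api/workspaces/acme/draft-to-issue/<draft_id>/",
    headers={"Authorization": "Bearer <token>"},  # placeholder auth
)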

View File

@@ -40,6 +40,8 @@ from .workspace.base import (
ExportWorkspaceUserActivityEndpoint,
)
from .workspace.draft import WorkspaceDraftIssueViewSet
from .workspace.favorite import (
WorkspaceFavoriteEndpoint,
WorkspaceFavoriteGroupEndpoint,
@@ -98,6 +100,8 @@ from .cycle.base import (
CycleUserPropertiesEndpoint,
CycleViewSet,
TransferCycleIssueEndpoint,
CycleAnalyticsEndpoint,
CycleProgressEndpoint,
)
from .cycle.issue import (
CycleIssueViewSet,
@@ -106,12 +110,26 @@ from .cycle.archive import (
CycleArchiveUnarchiveEndpoint,
)
from .asset.base import FileAssetEndpoint, UserAssetsEndpoint, FileAssetViewSet
from .asset.base import (
FileAssetEndpoint,
UserAssetsEndpoint,
FileAssetViewSet,
)
from .asset.v2 import (
WorkspaceFileAssetEndpoint,
UserAssetsV2Endpoint,
StaticFileAssetEndpoint,
AssetRestoreEndpoint,
ProjectAssetEndpoint,
ProjectBulkAssetEndpoint,
)
from .issue.base import (
IssueListEndpoint,
IssueViewSet,
IssueUserDisplayPropertyEndpoint,
BulkDeleteIssuesEndpoint,
DeletedIssuesListViewSet,
IssuePaginatedViewSet,
)
from .issue.activity import (
@@ -122,6 +140,8 @@ from .issue.archive import IssueArchiveViewSet, BulkArchiveIssuesEndpoint
from .issue.attachment import (
IssueAttachmentEndpoint,
# V2
IssueAttachmentV2Endpoint,
)
from .issue.comment import (
@@ -129,8 +149,6 @@ from .issue.comment import (
CommentReactionViewSet,
)
from .issue.draft import IssueDraftViewSet
from .issue.label import (
LabelViewSet,
BulkCreateIssueLabelsEndpoint,
@@ -156,9 +174,6 @@ from .issue.subscriber import (
IssueSubscriberViewSet,
)
from .issue.bulk_operations import BulkIssueOperationsEndpoint
from .module.base import (
ModuleViewSet,
ModuleLinkViewSet,

View File

@@ -1,28 +1,35 @@
# Django imports
from django.db.models import Count, F, Sum
from django.db.models import Count, F, Sum, Q
from django.db.models.functions import ExtractMonth
from django.utils import timezone
from django.db.models.functions import Concat
from django.db.models import Case, When, Value
from django.db import models
# Third party imports
from rest_framework import status
from rest_framework.response import Response
# Module imports
from plane.app.permissions import WorkSpaceAdminPermission
from plane.app.serializers import AnalyticViewSerializer
# Module imports
from plane.app.views.base import BaseAPIView, BaseViewSet
from plane.bgtasks.analytic_plot_export import analytic_export_task
from plane.db.models import AnalyticView, Issue, Workspace
from plane.utils.analytics_plot import build_graph_plot
from plane.utils.issue_filters import issue_filters
from plane.app.permissions import allow_permission, ROLE
class AnalyticsEndpoint(BaseAPIView):
permission_classes = [
WorkSpaceAdminPermission,
]
@allow_permission(
[
ROLE.ADMIN,
ROLE.MEMBER,
],
level="WORKSPACE",
)
def get(self, request, slug):
x_axis = request.GET.get("x_axis", False)
y_axis = request.GET.get("y_axis", False)
@@ -103,7 +110,10 @@ class AnalyticsEndpoint(BaseAPIView):
if x_axis in ["labels__id"] or segment in ["labels__id"]:
label_details = (
Issue.objects.filter(
workspace__slug=slug, **filters, labels__id__isnull=False
workspace__slug=slug,
**filters,
labels__id__isnull=False,
label_issue__deleted_at__isnull=True,
)
.distinct("labels__id")
.order_by("labels__id")
@@ -114,14 +124,37 @@ class AnalyticsEndpoint(BaseAPIView):
if x_axis in ["assignees__id"] or segment in ["assignees__id"]:
assignee_details = (
Issue.issue_objects.filter(
Q(
Q(assignees__avatar__isnull=False)
| Q(assignees__avatar_asset__isnull=False)
),
workspace__slug=slug,
**filters,
assignees__avatar__isnull=False,
)
.annotate(
assignees__avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
assignees__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"assignees__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
assignees__avatar_asset__isnull=True,
then="assignees__avatar",
),
default=Value(None),
output_field=models.CharField(),
)
)
.order_by("assignees__id")
.distinct("assignees__id")
.values(
"assignees__avatar",
"assignees__avatar_url",
"assignees__display_name",
"assignees__first_name",
"assignees__last_name",
@@ -201,10 +234,14 @@ class AnalyticViewViewset(BaseViewSet):
class SavedAnalyticEndpoint(BaseAPIView):
permission_classes = [
WorkSpaceAdminPermission,
]
@allow_permission(
[
ROLE.ADMIN,
ROLE.MEMBER,
],
level="WORKSPACE",
)
def get(self, request, slug, analytic_id):
analytic_view = AnalyticView.objects.get(
pk=analytic_id, workspace__slug=slug
@@ -234,10 +271,14 @@ class SavedAnalyticEndpoint(BaseAPIView):
class ExportAnalyticsEndpoint(BaseAPIView):
permission_classes = [
WorkSpaceAdminPermission,
]
@allow_permission(
[
ROLE.ADMIN,
ROLE.MEMBER,
],
level="WORKSPACE",
)
def post(self, request, slug):
x_axis = request.data.get("x_axis", False)
y_axis = request.data.get("y_axis", False)
@@ -301,10 +342,8 @@ class ExportAnalyticsEndpoint(BaseAPIView):
class DefaultAnalyticsEndpoint(BaseAPIView):
permission_classes = [
WorkSpaceAdminPermission,
]
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST], level="WORKSPACE")
def get(self, request, slug):
filters = issue_filters(request.GET, "GET")
base_issues = Issue.issue_objects.filter(
@@ -345,7 +384,6 @@ class DefaultAnalyticsEndpoint(BaseAPIView):
user_details = [
"created_by__first_name",
"created_by__last_name",
"created_by__avatar",
"created_by__display_name",
"created_by__id",
]
@@ -354,13 +392,32 @@ class DefaultAnalyticsEndpoint(BaseAPIView):
base_issues.exclude(created_by=None)
.values(*user_details)
.annotate(count=Count("id"))
.annotate(
created_by__avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
created_by__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"created_by__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
created_by__avatar_asset__isnull=True,
then="created_by__avatar",
),
default=Value(None),
output_field=models.CharField(),
)
)
.order_by("-count")[:5]
)
user_assignee_details = [
"assignees__first_name",
"assignees__last_name",
"assignees__avatar",
"assignees__display_name",
"assignees__id",
]
@@ -369,6 +426,26 @@ class DefaultAnalyticsEndpoint(BaseAPIView):
base_issues.filter(completed_at__isnull=False)
.exclude(assignees=None)
.values(*user_assignee_details)
.annotate(
assignees__avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
assignees__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"assignees__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
assignees__avatar_asset__isnull=True,
then="assignees__avatar",
),
default=Value(None),
output_field=models.CharField(),
)
)
.annotate(count=Count("id"))
.order_by("-count")[:5]
)
@@ -377,15 +454,33 @@ class DefaultAnalyticsEndpoint(BaseAPIView):
base_issues.filter(completed_at__isnull=True)
.values(*user_assignee_details)
.annotate(count=Count("id"))
.annotate(
assignees__avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
assignees__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"assignees__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
assignees__avatar_asset__isnull=True,
then="assignees__avatar",
),
default=Value(None),
output_field=models.CharField(),
)
)
.order_by("-count")
)
open_estimate_sum = open_issues_queryset.aggregate(
sum=Sum("point")
)["sum"]
total_estimate_sum = base_issues.aggregate(sum=Sum("point"))[
open_estimate_sum = open_issues_queryset.aggregate(sum=Sum("point"))[
"sum"
]
total_estimate_sum = base_issues.aggregate(sum=Sum("point"))["sum"]
return Response(
{

View File

@@ -0,0 +1,803 @@
# Python imports
import uuid
# Django imports
from django.conf import settings
from django.http import HttpResponseRedirect
from django.utils import timezone
# Third party imports
from rest_framework import status
from rest_framework.response import Response
from rest_framework.permissions import AllowAny
# Module imports
from ..base import BaseAPIView
from plane.db.models import (
FileAsset,
Workspace,
Project,
User,
)
from plane.settings.storage import S3Storage
from plane.app.permissions import allow_permission, ROLE
from plane.utils.cache import invalidate_cache_directly
from plane.bgtasks.storage_metadata_task import get_asset_object_metadata
class UserAssetsV2Endpoint(BaseAPIView):
"""This endpoint is used to upload user profile images."""
def asset_delete(self, asset_id):
asset = FileAsset.objects.filter(id=asset_id).first()
if asset is None:
return
asset.is_deleted = True
asset.deleted_at = timezone.now()
asset.save()
return
def entity_asset_save(self, asset_id, entity_type, asset, request):
# User Avatar
if entity_type == FileAsset.EntityTypeContext.USER_AVATAR:
user = User.objects.get(id=asset.user_id)
user.avatar = ""
# Delete the previous avatar
if user.avatar_asset_id:
self.asset_delete(user.avatar_asset_id)
# Save the new avatar
user.avatar_asset_id = asset_id
user.save()
invalidate_cache_directly(
path="/api/users/me/",
url_params=False,
user=True,
request=request,
)
invalidate_cache_directly(
path="/api/users/me/settings/",
url_params=False,
user=True,
request=request,
)
return
# User Cover
if entity_type == FileAsset.EntityTypeContext.USER_COVER:
user = User.objects.get(id=asset.user_id)
user.cover_image = None
# Delete the previous cover image
if user.cover_image_asset_id:
self.asset_delete(user.cover_image_asset_id)
# Save the new cover image
user.cover_image_asset_id = asset_id
user.save()
invalidate_cache_directly(
path="/api/users/me/",
url_params=False,
user=True,
request=request,
)
invalidate_cache_directly(
path="/api/users/me/settings/",
url_params=False,
user=True,
request=request,
)
return
return
def entity_asset_delete(self, entity_type, asset, request):
# User Avatar
if entity_type == FileAsset.EntityTypeContext.USER_AVATAR:
user = User.objects.get(id=asset.user_id)
user.avatar_asset_id = None
user.save()
invalidate_cache_directly(
path="/api/users/me/",
url_params=False,
user=True,
request=request,
)
invalidate_cache_directly(
path="/api/users/me/settings/",
url_params=False,
user=True,
request=request,
)
return
# User Cover
if entity_type == FileAsset.EntityTypeContext.USER_COVER:
user = User.objects.get(id=asset.user_id)
user.cover_image_asset_id = None
user.save()
invalidate_cache_directly(
path="/api/users/me/",
url_params=False,
user=True,
request=request,
)
invalidate_cache_directly(
path="/api/users/me/settings/",
url_params=False,
user=True,
request=request,
)
return
return
def post(self, request):
# get the asset key
name = request.data.get("name")
type = request.data.get("type", "image/jpeg")
size = int(request.data.get("size", settings.FILE_SIZE_LIMIT))
entity_type = request.data.get("entity_type", False)
# Clamp the requested size to the configured limit
size_limit = min(size, settings.FILE_SIZE_LIMIT)
# Check if the entity type is allowed
if not entity_type or entity_type not in ["USER_AVATAR", "USER_COVER"]:
return Response(
{
"error": "Invalid entity type.",
"status": False,
},
status=status.HTTP_400_BAD_REQUEST,
)
# Check if the file type is allowed
allowed_types = ["image/jpeg", "image/png", "image/webp", "image/jpg"]
if type not in allowed_types:
return Response(
{
"error": "Invalid file type. Only JPEG and PNG files are allowed.",
"status": False,
},
status=status.HTTP_400_BAD_REQUEST,
)
# asset key
asset_key = f"{uuid.uuid4().hex}-{name}"
# Create a File Asset
asset = FileAsset.objects.create(
attributes={
"name": name,
"type": type,
"size": size_limit,
},
asset=asset_key,
size=size_limit,
user=request.user,
created_by=request.user,
entity_type=entity_type,
)
# Get the presigned URL
storage = S3Storage(request=request)
# Generate a presigned URL to share an S3 object
presigned_url = storage.generate_presigned_post(
object_name=asset_key,
file_type=type,
file_size=size_limit,
)
# Return the presigned URL
return Response(
{
"upload_data": presigned_url,
"asset_id": str(asset.id),
"asset_url": asset.asset_url,
},
status=status.HTTP_200_OK,
)
def patch(self, request, asset_id):
# get the asset id
asset = FileAsset.objects.get(id=asset_id, user_id=request.user.id)
# mark the asset as uploaded
asset.is_uploaded = True
# get the storage metadata
if not asset.storage_metadata:
get_asset_object_metadata.delay(asset_id=str(asset_id))
# get the entity and save the asset id for the request field
self.entity_asset_save(
asset_id=asset_id,
entity_type=asset.entity_type,
asset=asset,
request=request,
)
# update the attributes
asset.attributes = request.data.get("attributes", asset.attributes)
# save the asset
asset.save()
return Response(status=status.HTTP_204_NO_CONTENT)
def delete(self, request, asset_id):
asset = FileAsset.objects.get(id=asset_id, user_id=request.user.id)
asset.is_deleted = True
asset.deleted_at = timezone.now()
# get the entity and save the asset id for the request field
self.entity_asset_delete(
entity_type=asset.entity_type, asset=asset, request=request
)
asset.save()
return Response(status=status.HTTP_204_NO_CONTENT)
class WorkspaceFileAssetEndpoint(BaseAPIView):
"""This endpoint is used to upload cover images/logos etc for workspace, projects and users."""
def get_entity_id_field(self, entity_type, entity_id):
# Workspace Logo
if entity_type == FileAsset.EntityTypeContext.WORKSPACE_LOGO:
return {
"workspace_id": entity_id,
}
# Project Cover
if entity_type == FileAsset.EntityTypeContext.PROJECT_COVER:
return {
"project_id": entity_id,
}
# User Avatar and Cover
if entity_type in [
FileAsset.EntityTypeContext.USER_AVATAR,
FileAsset.EntityTypeContext.USER_COVER,
]:
return {
"user_id": entity_id,
}
# Issue Attachment and Description
if entity_type in [
FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
FileAsset.EntityTypeContext.ISSUE_DESCRIPTION,
]:
return {
"issue_id": entity_id,
}
# Page Description
if entity_type == FileAsset.EntityTypeContext.PAGE_DESCRIPTION:
return {
"page_id": entity_id,
}
# Comment Description
if entity_type == FileAsset.EntityTypeContext.COMMENT_DESCRIPTION:
return {
"comment_id": entity_id,
}
return {}
def asset_delete(self, asset_id):
asset = FileAsset.objects.filter(id=asset_id).first()
# Check if the asset exists
if asset is None:
return
# Mark the asset as deleted
asset.is_deleted = True
asset.deleted_at = timezone.now()
asset.save()
return
def entity_asset_save(self, asset_id, entity_type, asset, request):
# Workspace Logo
if entity_type == FileAsset.EntityTypeContext.WORKSPACE_LOGO:
workspace = Workspace.objects.filter(id=asset.workspace_id).first()
if workspace is None:
return
# Delete the previous logo
if workspace.logo_asset_id:
self.asset_delete(workspace.logo_asset_id)
# Save the new logo
workspace.logo = ""
workspace.logo_asset_id = asset_id
workspace.save()
invalidate_cache_directly(
path="/api/workspaces/",
url_params=False,
user=False,
request=request,
)
invalidate_cache_directly(
path="/api/users/me/workspaces/",
url_params=False,
user=True,
request=request,
)
invalidate_cache_directly(
path="/api/instances/",
url_params=False,
user=False,
request=request,
)
return
# Project Cover
elif entity_type == FileAsset.EntityTypeContext.PROJECT_COVER:
project = Project.objects.filter(id=asset.project_id).first()
if project is None:
return
# Delete the previous cover image
if project.cover_image_asset_id:
self.asset_delete(project.cover_image_asset_id)
# Save the new cover image
project.cover_image = ""
project.cover_image_asset_id = asset_id
project.save()
return
else:
return
def entity_asset_delete(self, entity_type, asset, request):
# Workspace Logo
if entity_type == FileAsset.EntityTypeContext.WORKSPACE_LOGO:
workspace = Workspace.objects.get(id=asset.workspace_id)
if workspace is None:
return
workspace.logo_asset_id = None
workspace.save()
invalidate_cache_directly(
path="/api/workspaces/",
url_params=False,
user=False,
request=request,
)
invalidate_cache_directly(
path="/api/users/me/workspaces/",
url_params=False,
user=True,
request=request,
)
invalidate_cache_directly(
path="/api/instances/",
url_params=False,
user=False,
request=request,
)
return
# Project Cover
elif entity_type == FileAsset.EntityTypeContext.PROJECT_COVER:
project = Project.objects.filter(id=asset.project_id).first()
if project is None:
return
project.cover_image_asset_id = None
project.save()
return
else:
return
def post(self, request, slug):
name = request.data.get("name")
type = request.data.get("type", "image/jpeg")
size = int(request.data.get("size", settings.FILE_SIZE_LIMIT))
entity_type = request.data.get("entity_type")
entity_identifier = request.data.get("entity_identifier", False)
# Check if the entity type is allowed
if entity_type not in FileAsset.EntityTypeContext.values:
return Response(
{
"error": "Invalid entity type.",
"status": False,
},
status=status.HTTP_400_BAD_REQUEST,
)
# Check if the file type is allowed
allowed_types = ["image/jpeg", "image/png", "image/webp", "image/jpg"]
if type not in allowed_types:
return Response(
{
"error": "Invalid file type. Only JPEG and PNG files are allowed.",
"status": False,
},
status=status.HTTP_400_BAD_REQUEST,
)
# Get the size limit
size_limit = min(settings.FILE_SIZE_LIMIT, size)
# Get the workspace
workspace = Workspace.objects.get(slug=slug)
# asset key
asset_key = f"{workspace.id}/{uuid.uuid4().hex}-{name}"
# Create a File Asset
asset = FileAsset.objects.create(
attributes={
"name": name,
"type": type,
"size": size_limit,
},
asset=asset_key,
size=size_limit,
workspace=workspace,
created_by=request.user,
entity_type=entity_type,
**self.get_entity_id_field(
entity_type=entity_type, entity_id=entity_identifier
),
)
# Get the presigned URL
storage = S3Storage(request=request)
# Generate a presigned URL to share an S3 object
presigned_url = storage.generate_presigned_post(
object_name=asset_key,
file_type=type,
file_size=size_limit,
)
# Return the presigned URL
return Response(
{
"upload_data": presigned_url,
"asset_id": str(asset.id),
"asset_url": asset.asset_url,
},
status=status.HTTP_200_OK,
)
def patch(self, request, slug, asset_id):
# get the asset id
asset = FileAsset.objects.get(id=asset_id, workspace__slug=slug)
# mark the asset as uploaded
asset.is_uploaded = True
# get the storage metadata
if not asset.storage_metadata:
get_asset_object_metadata.delay(asset_id=str(asset_id))
# get the entity and save the asset id for the request field
self.entity_asset_save(
asset_id=asset_id,
entity_type=asset.entity_type,
asset=asset,
request=request,
)
# update the attributes
asset.attributes = request.data.get("attributes", asset.attributes)
# save the asset
asset.save()
return Response(status=status.HTTP_204_NO_CONTENT)
def delete(self, request, slug, asset_id):
asset = FileAsset.objects.get(id=asset_id, workspace__slug=slug)
asset.is_deleted = True
asset.deleted_at = timezone.now()
# get the entity and save the asset id for the request field
self.entity_asset_delete(
entity_type=asset.entity_type, asset=asset, request=request
)
asset.save()
return Response(status=status.HTTP_204_NO_CONTENT)
def get(self, request, slug, asset_id):
# get the asset id
asset = FileAsset.objects.get(id=asset_id, workspace__slug=slug)
# Check if the asset is uploaded
if not asset.is_uploaded:
return Response(
{
"error": "The requested asset could not be found.",
},
status=status.HTTP_404_NOT_FOUND,
)
# Get the presigned URL
storage = S3Storage(request=request)
# Generate a presigned URL to share an S3 object
signed_url = storage.generate_presigned_url(
object_name=asset.asset.name,
)
# Redirect to the signed URL
return HttpResponseRedirect(signed_url)
class StaticFileAssetEndpoint(BaseAPIView):
"""This endpoint is used to get the signed URL for a static asset."""
permission_classes = [
AllowAny,
]
def get(self, request, asset_id):
# get the asset id
asset = FileAsset.objects.get(id=asset_id)
# Check if the asset is uploaded
if not asset.is_uploaded:
return Response(
{
"error": "The requested asset could not be found.",
},
status=status.HTTP_404_NOT_FOUND,
)
# Check if the entity type is allowed
if asset.entity_type not in [
FileAsset.EntityTypeContext.USER_AVATAR,
FileAsset.EntityTypeContext.USER_COVER,
FileAsset.EntityTypeContext.WORKSPACE_LOGO,
FileAsset.EntityTypeContext.PROJECT_COVER,
]:
return Response(
{
"error": "Invalid entity type.",
"status": False,
},
status=status.HTTP_400_BAD_REQUEST,
)
# Get the presigned URL
storage = S3Storage(request=request)
# Generate a presigned URL to share an S3 object
signed_url = storage.generate_presigned_url(
object_name=asset.asset.name,
)
# Redirect to the signed URL
return HttpResponseRedirect(signed_url)
class AssetRestoreEndpoint(BaseAPIView):
"""Endpoint to restore a deleted assets."""
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST], level="WORKSPACE")
def post(self, request, slug, asset_id):
asset = FileAsset.all_objects.get(id=asset_id, workspace__slug=slug)
asset.is_deleted = False
asset.deleted_at = None
asset.save()
return Response(status=status.HTTP_204_NO_CONTENT)
class ProjectAssetEndpoint(BaseAPIView):
"""This endpoint is used to upload cover images/logos etc for workspace, projects and users."""
def get_entity_id_field(self, entity_type, entity_id):
if entity_type == FileAsset.EntityTypeContext.WORKSPACE_LOGO:
return {
"workspace_id": entity_id,
}
if entity_type == FileAsset.EntityTypeContext.PROJECT_COVER:
return {
"project_id": entity_id,
}
if entity_type in [
FileAsset.EntityTypeContext.USER_AVATAR,
FileAsset.EntityTypeContext.USER_COVER,
]:
return {
"user_id": entity_id,
}
if entity_type in [
FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
FileAsset.EntityTypeContext.ISSUE_DESCRIPTION,
]:
return {
"issue_id": entity_id,
}
if entity_type == FileAsset.EntityTypeContext.PAGE_DESCRIPTION:
return {
"page_id": entity_id,
}
if entity_type == FileAsset.EntityTypeContext.COMMENT_DESCRIPTION:
return {
"comment_id": entity_id,
}
if entity_type == FileAsset.EntityTypeContext.DRAFT_ISSUE_DESCRIPTION:
return {
"draft_issue_id": entity_id,
}
return {}
@allow_permission(
[ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST],
)
def post(self, request, slug, project_id):
name = request.data.get("name")
type = request.data.get("type", "image/jpeg")
size = int(request.data.get("size", settings.FILE_SIZE_LIMIT))
entity_type = request.data.get("entity_type", "")
entity_identifier = request.data.get("entity_identifier")
# Check if the entity type is allowed
if entity_type not in FileAsset.EntityTypeContext.values:
return Response(
{
"error": "Invalid entity type.",
"status": False,
},
status=status.HTTP_400_BAD_REQUEST,
)
# Check if the file type is allowed
allowed_types = ["image/jpeg", "image/png", "image/webp", "image/jpg"]
if type not in allowed_types:
return Response(
{
"error": "Invalid file type. Only JPEG and PNG files are allowed.",
"status": False,
},
status=status.HTTP_400_BAD_REQUEST,
)
# Get the size limit
size_limit = min(settings.FILE_SIZE_LIMIT, size)
# Get the workspace
workspace = Workspace.objects.get(slug=slug)
# asset key
asset_key = f"{workspace.id}/{uuid.uuid4().hex}-{name}"
# Create a File Asset
asset = FileAsset.objects.create(
attributes={
"name": name,
"type": type,
"size": size_limit,
},
asset=asset_key,
size=size_limit,
workspace=workspace,
created_by=request.user,
entity_type=entity_type,
project_id=project_id,
**self.get_entity_id_field(entity_type, entity_identifier),
)
# Get the presigned URL
storage = S3Storage(request=request)
# Generate a presigned URL to share an S3 object
presigned_url = storage.generate_presigned_post(
object_name=asset_key,
file_type=type,
file_size=size_limit,
)
# Return the presigned URL
return Response(
{
"upload_data": presigned_url,
"asset_id": str(asset.id),
"asset_url": asset.asset_url,
},
status=status.HTTP_200_OK,
)
@allow_permission(
[ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST],
)
def patch(self, request, slug, project_id, pk):
# get the asset id
asset = FileAsset.objects.get(
id=pk,
)
# mark the asset as uploaded
asset.is_uploaded = True
# get the storage metadata
if not asset.storage_metadata:
get_asset_object_metadata.delay(asset_id=str(pk))
# update the attributes
asset.attributes = request.data.get("attributes", asset.attributes)
# save the asset
asset.save()
return Response(status=status.HTTP_204_NO_CONTENT)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def delete(self, request, slug, project_id, pk):
# Get the asset
asset = FileAsset.objects.get(
id=pk,
workspace__slug=slug,
project_id=project_id,
)
# Mark the asset as deleted
asset.is_deleted = True
asset.deleted_at = timezone.now()
# Save the asset
asset.save()
return Response(status=status.HTTP_204_NO_CONTENT)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def get(self, request, slug, project_id, pk):
# get the asset
asset = FileAsset.objects.get(
workspace__slug=slug,
project_id=project_id,
pk=pk,
)
# Check if the asset is uploaded
if not asset.is_uploaded:
return Response(
{
"error": "The requested asset could not be found.",
},
status=status.HTTP_404_NOT_FOUND,
)
# Get the presigned URL
storage = S3Storage(request=request)
# Generate a presigned URL to share an S3 object
signed_url = storage.generate_presigned_url(
object_name=asset.asset.name,
)
# Redirect to the signed URL
return HttpResponseRedirect(signed_url)
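Taken together, the POST, PATCH, and GET handlers above imply a three-step client flow: ask the API for presigned upload data, push the bytes straight to S3, then confirm the upload so storage metadata can be collected. A minimal client-side sketch follows; the routes, the session authentication, and the shape of upload_data (a boto3-style dict with "url" and "fields") are assumptions, not part of this diff.

import requests

API = "https://plane.example.com/api"  # hypothetical base URL
session = requests.Session()  # assumed to already carry authentication

def upload_project_cover(slug, project_id, entity_id, file_path):
    with open(file_path, "rb") as f:
        payload = f.read()
    # 1. Request presigned POST data and an asset id (route is an assumption).
    res = session.post(
        f"{API}/assets/v2/workspaces/{slug}/projects/{project_id}/",
        json={
            "name": file_path.rsplit("/", 1)[-1],
            "type": "image/png",
            "size": len(payload),
            "entity_type": "PROJECT_COVER",
            "entity_identifier": entity_id,
        },
    )
    res.raise_for_status()
    data = res.json()
    upload = data["upload_data"]
    # 2. Upload directly to S3 with the presigned POST fields
    #    (assumes the boto3 {"url": ..., "fields": ...} shape).
    requests.post(
        upload["url"], data=upload["fields"], files={"file": payload}
    ).raise_for_status()
    # 3. Confirm the upload so the server marks it uploaded and queues
    #    storage metadata collection (route is an assumption).
    session.patch(
        f"{API}/assets/v2/workspaces/{slug}/projects/{project_id}/{data['asset_id']}/",
        json={"attributes": {"name": file_path.rsplit("/", 1)[-1]}},
    )
    return data["asset_id"], data["asset_url"]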
class ProjectBulkAssetEndpoint(BaseAPIView):
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def post(self, request, slug, project_id, entity_id):
asset_ids = request.data.get("asset_ids", [])
# Check if the asset ids are provided
if not asset_ids:
return Response(
{
"error": "No asset ids provided.",
},
status=status.HTTP_400_BAD_REQUEST,
)
# get the assets
assets = FileAsset.objects.filter(
id__in=asset_ids,
workspace__slug=slug,
)
# Get the first asset
asset = assets.first()
if not asset:
return Response(
{
"error": "The requested asset could not be found.",
},
status=status.HTTP_404_NOT_FOUND,
)
# Update the entity foreign key based on the entity type of the first asset
if asset.entity_type == FileAsset.EntityTypeContext.PROJECT_COVER:
assets.update(
project_id=project_id,
)
if asset.entity_type == FileAsset.EntityTypeContext.ISSUE_DESCRIPTION:
assets.update(
issue_id=entity_id,
)
if (
asset.entity_type
== FileAsset.EntityTypeContext.COMMENT_DESCRIPTION
):
assets.update(
comment_id=entity_id,
)
if asset.entity_type == FileAsset.EntityTypeContext.PAGE_DESCRIPTION:
assets.update(
page_id=entity_id,
)
if (
asset.entity_type
== FileAsset.EntityTypeContext.DRAFT_ISSUE_DESCRIPTION
):
assets.update(
draft_issue_id=entity_id,
)
return Response(status=status.HTTP_204_NO_CONTENT)
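Because the bulk endpoint reads the entity_type off the first matching asset and then stamps every asset in the queryset with the same foreign key, all asset_ids sent in one call should belong to the same entity type. A minimal request sketch, with an assumed route derived from the view signature (slug, project_id, entity_id) and an assumed auth scheme:

import requests

# Associate previously uploaded assets with an entity (e.g. an issue).
requests.post(
    "https://plane.example.com/api/assets/v2/workspaces/acme/projects/42/issues/99/bulk/",
    json={"asset_ids": ["6f1e2b", "9ab2c4"]},  # hypothetical asset ids
    headers={"Authorization": "Bearer <token>"},  # assumed auth scheme
).raise_for_status()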


@@ -1,6 +1,7 @@
# Django imports
from django.contrib.postgres.aggregates import ArrayAgg
from django.contrib.postgres.fields import ArrayField
from django.db import models
from django.db.models import (
Case,
CharField,
@@ -18,13 +19,13 @@ from django.db.models import (
Sum,
FloatField,
)
from django.db.models.functions import Coalesce, Cast
from django.db.models.functions import Coalesce, Cast, Concat
from django.utils import timezone
# Third party imports
from rest_framework import status
from rest_framework.response import Response
from plane.app.permissions import ProjectEntityPermission
from plane.app.permissions import allow_permission, ROLE
from plane.db.models import Cycle, UserFavorite, Issue, Label, User, Project
from plane.utils.analytics_plot import burndown_plot
@@ -34,10 +35,6 @@ from .. import BaseAPIView
class CycleArchiveUnarchiveEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
def get_queryset(self):
favorite_subquery = UserFavorite.objects.filter(
user=self.request.user,
@@ -143,7 +140,7 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
Prefetch(
"issue_cycle__issue__assignees",
queryset=User.objects.only(
"avatar", "first_name", "id"
"avatar_asset", "first_name", "id"
).distinct(),
)
)
@@ -163,6 +160,7 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
filter=Q(
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__issue__deleted_at__isnull=True,
),
)
)
@@ -174,6 +172,7 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
issue_cycle__issue__state__group="completed",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__issue__deleted_at__isnull=True,
),
)
)
@@ -185,6 +184,7 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
issue_cycle__issue__state__group="cancelled",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__issue__deleted_at__isnull=True,
),
)
)
@@ -196,6 +196,7 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
issue_cycle__issue__state__group="started",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__issue__deleted_at__isnull=True,
),
)
)
@@ -207,6 +208,7 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
issue_cycle__issue__state__group="unstarted",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__issue__deleted_at__isnull=True,
),
)
)
@@ -218,6 +220,7 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
issue_cycle__issue__state__group="backlog",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__issue__deleted_at__isnull=True,
),
)
)
@@ -292,6 +295,12 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
.distinct()
)
@allow_permission(
[
ROLE.ADMIN,
ROLE.MEMBER,
]
)
def get(self, request, slug, project_id, pk=None):
if pk is None:
queryset = (
@@ -398,8 +407,27 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
)
.annotate(display_name=F("assignees__display_name"))
.annotate(assignee_id=F("assignees__id"))
.annotate(avatar=F("assignees__avatar"))
.values("display_name", "assignee_id", "avatar")
.annotate(
avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
assignees__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"assignees__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
assignees__avatar_asset__isnull=True,
then="assignees__avatar",
),
default=Value(None),
output_field=models.CharField(),
)
)
.values("display_name", "assignee_id", "avatar_url")
.annotate(
total_estimates=Sum(
Cast("estimate_point__value", FloatField())
@@ -492,13 +520,32 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
.annotate(first_name=F("assignees__first_name"))
.annotate(last_name=F("assignees__last_name"))
.annotate(assignee_id=F("assignees__id"))
.annotate(avatar=F("assignees__avatar"))
.annotate(
avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
assignees__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"assignees__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
assignees__avatar_asset__isnull=True,
then="assignees__avatar",
),
default=Value(None),
output_field=models.CharField(),
)
)
.annotate(display_name=F("assignees__display_name"))
.values(
"first_name",
"last_name",
"assignee_id",
"avatar",
"avatar_url",
"display_name",
)
.annotate(
@@ -596,12 +643,13 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
status=status.HTTP_200_OK,
)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def post(self, request, slug, project_id, cycle_id):
cycle = Cycle.objects.get(
pk=cycle_id, project_id=project_id, workspace__slug=slug
)
if cycle.end_date >= timezone.now().date():
if cycle.end_date >= timezone.now():
return Response(
{"error": "Only completed cycles can be archived"},
status=status.HTTP_400_BAD_REQUEST,
@@ -609,11 +657,18 @@ class CycleArchiveUnarchiveEndpoint(BaseAPIView):
cycle.archived_at = timezone.now()
cycle.save()
UserFavorite.objects.filter(
entity_type="cycle",
entity_identifier=cycle_id,
project_id=project_id,
workspace__slug=slug,
).delete()
return Response(
{"archived_at": str(cycle.archived_at)},
status=status.HTTP_200_OK,
)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def delete(self, request, slug, project_id, cycle_id):
cycle = Cycle.objects.get(
pk=cycle_id, project_id=project_id, workspace__slug=slug

File diff suppressed because it is too large.


@@ -3,12 +3,7 @@ import json
# Django imports
from django.core import serializers
from django.db.models import (
F,
Func,
OuterRef,
Q,
)
from django.db.models import F, Func, OuterRef, Q, Subquery
from django.utils import timezone
from django.utils.decorators import method_decorator
from django.views.decorators.gzip import gzip_page
@@ -17,21 +12,17 @@ from django.views.decorators.gzip import gzip_page
from rest_framework import status
from rest_framework.response import Response
from plane.app.permissions import (
ProjectEntityPermission,
)
# Module imports
from .. import BaseViewSet
from plane.app.serializers import (
CycleIssueSerializer,
)
from plane.bgtasks.issue_activites_task import issue_activity
from plane.bgtasks.issue_activities_task import issue_activity
from plane.db.models import (
Cycle,
CycleIssue,
Issue,
IssueAttachment,
FileAsset,
IssueLink,
)
from plane.utils.grouper import (
@@ -45,6 +36,7 @@ from plane.utils.paginator import (
GroupedOffsetPaginator,
SubGroupedOffsetPaginator,
)
from plane.app.permissions import allow_permission, ROLE
class CycleIssueViewSet(BaseViewSet):
@@ -54,10 +46,6 @@ class CycleIssueViewSet(BaseViewSet):
webhook_event = "cycle_issue"
bulk = True
permission_classes = [
ProjectEntityPermission,
]
filterset_fields = [
"issue__labels__id",
"issue__assignees__id",
@@ -92,6 +80,12 @@ class CycleIssueViewSet(BaseViewSet):
)
@method_decorator(gzip_page)
@allow_permission(
[
ROLE.ADMIN,
ROLE.MEMBER,
]
)
def list(self, request, slug, project_id, cycle_id):
order_by_param = request.GET.get("order_by", "created_at")
filters = issue_filters(request.query_params, "GET")
@@ -108,7 +102,13 @@ class CycleIssueViewSet(BaseViewSet):
"issue_cycle__cycle",
)
.filter(**filters)
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(
cycle_id=Subquery(
CycleIssue.objects.filter(
issue=OuterRef("id"), deleted_at__isnull=True
).values("cycle_id")[:1]
)
)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@@ -116,8 +116,9 @@ class CycleIssueViewSet(BaseViewSet):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -238,6 +239,7 @@ class CycleIssueViewSet(BaseViewSet):
),
)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def create(self, request, slug, project_id, cycle_id):
issues = request.data.get("issues", [])
@@ -251,10 +253,7 @@ class CycleIssueViewSet(BaseViewSet):
workspace__slug=slug, project_id=project_id, pk=cycle_id
)
if (
cycle.end_date is not None
and cycle.end_date < timezone.now().date()
):
if cycle.end_date is not None and cycle.end_date < timezone.now():
return Response(
{
"error": "The Cycle has already been completed so no new issues can be added"
@@ -333,6 +332,7 @@ class CycleIssueViewSet(BaseViewSet):
)
return Response({"message": "success"}, status=status.HTTP_201_CREATED)
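The recurring change in this diff replaces the plain join annotation cycle_id=F("issue_cycle__cycle_id") with a correlated Subquery that ignores soft-deleted CycleIssue rows, so an issue whose cycle link was soft-deleted no longer reports a stale cycle_id. Isolated from the surrounding querysets, the pattern is just the following sketch (imports as used elsewhere in the codebase):

from django.db.models import OuterRef, Subquery

from plane.db.models import CycleIssue, Issue

# Annotate each issue with the cycle of its first non-deleted CycleIssue link;
# issues without a live link get a NULL cycle_id instead of a soft-deleted one.
issues = Issue.issue_objects.annotate(
    cycle_id=Subquery(
        CycleIssue.objects.filter(
            issue=OuterRef("id"), deleted_at__isnull=True
        ).values("cycle_id")[:1]
    )
)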
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def destroy(self, request, slug, project_id, cycle_id, issue_id):
cycle_issue = CycleIssue.objects.filter(
issue_id=issue_id,


@@ -36,13 +36,13 @@ from plane.db.models import (
DashboardWidget,
Issue,
IssueActivity,
IssueAttachment,
FileAsset,
IssueLink,
IssueRelation,
Project,
ProjectMember,
User,
Widget,
WorkspaceMember,
CycleIssue,
)
from plane.utils.issue_filters import issue_filters
@@ -51,36 +51,112 @@ from .. import BaseAPIView
def dashboard_overview_stats(self, request, slug):
assigned_issues = Issue.issue_objects.filter(
project__project_projectmember__is_active=True,
project__project_projectmember__member=request.user,
workspace__slug=slug,
assignees__in=[request.user],
).count()
assigned_issues = (
Issue.issue_objects.filter(
project__project_projectmember__is_active=True,
project__project_projectmember__member=request.user,
workspace__slug=slug,
assignees__in=[request.user],
)
.filter(
Q(
project__project_projectmember__role=5,
project__guest_view_all_features=True,
)
| Q(
project__project_projectmember__role=5,
project__guest_view_all_features=False,
created_by=self.request.user,
)
|
# For other roles (role > 5), show all issues
Q(project__project_projectmember__role__gt=5),
project__project_projectmember__member=self.request.user,
project__project_projectmember__is_active=True,
)
.count()
)
pending_issues_count = Issue.issue_objects.filter(
~Q(state__group__in=["completed", "cancelled"]),
target_date__lt=timezone.now().date(),
project__project_projectmember__is_active=True,
project__project_projectmember__member=request.user,
workspace__slug=slug,
assignees__in=[request.user],
).count()
pending_issues_count = (
Issue.issue_objects.filter(
~Q(state__group__in=["completed", "cancelled"]),
target_date__lt=timezone.now().date(),
project__project_projectmember__is_active=True,
project__project_projectmember__member=request.user,
workspace__slug=slug,
assignees__in=[request.user],
)
.filter(
Q(
project__project_projectmember__role=5,
project__guest_view_all_features=True,
)
| Q(
project__project_projectmember__role=5,
project__guest_view_all_features=False,
created_by=self.request.user,
)
|
# For other roles (role > 5), show all issues
Q(project__project_projectmember__role__gt=5),
project__project_projectmember__member=self.request.user,
project__project_projectmember__is_active=True,
)
.count()
)
created_issues_count = Issue.issue_objects.filter(
workspace__slug=slug,
project__project_projectmember__is_active=True,
project__project_projectmember__member=request.user,
created_by_id=request.user.id,
).count()
created_issues_count = (
Issue.issue_objects.filter(
workspace__slug=slug,
project__project_projectmember__is_active=True,
project__project_projectmember__member=request.user,
created_by_id=request.user.id,
)
.filter(
Q(
project__project_projectmember__role=5,
project__guest_view_all_features=True,
)
| Q(
project__project_projectmember__role=5,
project__guest_view_all_features=False,
created_by=self.request.user,
)
|
# For other roles (role > 5), show all issues
Q(project__project_projectmember__role__gt=5),
project__project_projectmember__member=self.request.user,
project__project_projectmember__is_active=True,
)
.count()
)
completed_issues_count = Issue.issue_objects.filter(
workspace__slug=slug,
project__project_projectmember__is_active=True,
project__project_projectmember__member=request.user,
assignees__in=[request.user],
state__group="completed",
).count()
completed_issues_count = (
Issue.issue_objects.filter(
workspace__slug=slug,
project__project_projectmember__is_active=True,
project__project_projectmember__member=request.user,
assignees__in=[request.user],
state__group="completed",
)
.filter(
Q(
project__project_projectmember__role=5,
project__guest_view_all_features=True,
)
| Q(
project__project_projectmember__role=5,
project__guest_view_all_features=False,
created_by=self.request.user,
)
|
# For other roles (role > 5), show all issues
Q(project__project_projectmember__role__gt=5),
project__project_projectmember__member=self.request.user,
project__project_projectmember__is_active=True,
)
.count()
)
return Response(
{
@@ -116,7 +192,13 @@ def dashboard_assigned_issues(self, request, slug):
).select_related("issue"),
)
)
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(
cycle_id=Subquery(
CycleIssue.objects.filter(
issue=OuterRef("id"), deleted_at__isnull=True
).values("cycle_id")[:1]
)
)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@@ -124,8 +206,9 @@ def dashboard_assigned_issues(self, request, slug):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -142,7 +225,10 @@ def dashboard_assigned_issues(self, request, slug):
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
filter=Q(
~Q(labels__id__isnull=True)
& Q(label_issue__deleted_at__isnull=True),
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -150,8 +236,11 @@ def dashboard_assigned_issues(self, request, slug):
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True),
filter=Q(
~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True)
& Q(issue_assignee__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -159,13 +248,25 @@ def dashboard_assigned_issues(self, request, slug):
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=~Q(issue_module__module_id__isnull=True),
filter=Q(
~Q(issue_module__module_id__isnull=True)
& Q(issue_module__module__archived_at__isnull=True)
& Q(issue_module__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
)
)
if WorkspaceMember.objects.filter(
workspace__slug=slug,
member=request.user,
role=5,
is_active=True,
).exists():
assigned_issues = assigned_issues.filter(created_by=request.user)
# Priority Ordering
priority_order = ["urgent", "high", "medium", "low", "none"]
assigned_issues = assigned_issues.annotate(
@@ -271,7 +372,13 @@ def dashboard_created_issues(self, request, slug):
.filter(**filters)
.select_related("workspace", "project", "state", "parent")
.prefetch_related("assignees", "labels", "issue_module__module")
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(
cycle_id=Subquery(
CycleIssue.objects.filter(
issue=OuterRef("id"), deleted_at__isnull=True
).values("cycle_id")[:1]
)
)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@@ -279,8 +386,9 @@ def dashboard_created_issues(self, request, slug):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -297,7 +405,10 @@ def dashboard_created_issues(self, request, slug):
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
filter=Q(
~Q(labels__id__isnull=True)
& Q(label_issue__deleted_at__isnull=True),
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -305,8 +416,11 @@ def dashboard_created_issues(self, request, slug):
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True),
filter=Q(
~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True)
& Q(issue_assignee__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -314,7 +428,11 @@ def dashboard_created_issues(self, request, slug):
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=~Q(issue_module__module_id__isnull=True),
filter=Q(
~Q(issue_module__module_id__isnull=True)
& Q(issue_module__module__archived_at__isnull=True)
& Q(issue_module__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -409,6 +527,16 @@ def dashboard_created_issues(self, request, slug):
def dashboard_issues_by_state_groups(self, request, slug):
filters = issue_filters(request.query_params, "GET")
state_order = ["backlog", "unstarted", "started", "completed", "cancelled"]
extra_filters = {}
if WorkspaceMember.objects.filter(
workspace__slug=slug,
member=request.user,
role=5,
is_active=True,
).exists():
extra_filters = {"created_by": request.user}
issues_by_state_groups = (
Issue.issue_objects.filter(
workspace__slug=slug,
@@ -416,7 +544,7 @@ def dashboard_issues_by_state_groups(self, request, slug):
project__project_projectmember__member=request.user,
assignees__in=[request.user],
)
.filter(**filters)
.filter(**filters, **extra_filters)
.values("state__group")
.annotate(count=Count("id"))
)
@@ -439,6 +567,15 @@ def dashboard_issues_by_state_groups(self, request, slug):
def dashboard_issues_by_priority(self, request, slug):
filters = issue_filters(request.query_params, "GET")
priority_order = ["urgent", "high", "medium", "low", "none"]
extra_filters = {}
if WorkspaceMember.objects.filter(
workspace__slug=slug,
member=request.user,
role=5,
is_active=True,
).exists():
extra_filters = {"created_by": request.user}
issues_by_priority = (
Issue.issue_objects.filter(
@@ -447,7 +584,7 @@ def dashboard_issues_by_priority(self, request, slug):
project__project_projectmember__member=request.user,
assignees__in=[request.user],
)
.filter(**filters)
.filter(**filters, **extra_filters)
.values("priority")
.annotate(count=Count("id"))
)
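The same guest-scoping rule is repeated across these dashboard views: if the requesting user is a workspace guest (role 5), only issues they created are counted. A small helper expressing that rule, offered as a sketch of the shared logic rather than code that exists in the repository:

from plane.db.models import WorkspaceMember

def scope_queryset_for_guest(queryset, slug, user):
    # Workspace guests (role == 5) only see issues they created;
    # all other roles see the full filtered queryset.
    is_guest = WorkspaceMember.objects.filter(
        workspace__slug=slug, member=user, role=5, is_active=True
    ).exists()
    return queryset.filter(created_by=user) if is_guest else queryset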
@@ -521,105 +658,42 @@ def dashboard_recent_projects(self, request, slug):
def dashboard_recent_collaborators(self, request, slug):
# Subquery to count activities for each project member
activity_count_subquery = (
IssueActivity.objects.filter(
workspace__slug=slug,
actor=OuterRef("member"),
project__project_projectmember__member=request.user,
project__project_projectmember__is_active=True,
project__archived_at__isnull=True,
)
.values("actor")
.annotate(num_activities=Count("pk"))
.values("num_activities")
)
# Get all project members and annotate them with activity counts
project_members_with_activities = (
ProjectMember.objects.filter(
WorkspaceMember.objects.filter(
workspace__slug=slug,
project__project_projectmember__member=request.user,
project__project_projectmember__is_active=True,
project__archived_at__isnull=True,
is_active=True,
)
.annotate(
num_activities=Coalesce(
Subquery(activity_count_subquery),
Value(0),
output_field=IntegerField(),
),
is_current_user=Case(
When(member=request.user, then=Value(0)),
default=Value(1),
output_field=IntegerField(),
active_issue_count=Count(
Case(
When(
member__issue_assignee__issue__state__group__in=[
"unstarted",
"started",
],
member__issue_assignee__issue__workspace__slug=slug,
member__issue_assignee__issue__project__project_projectmember__member=request.user,
member__issue_assignee__issue__project__project_projectmember__is_active=True,
then=F("member__issue_assignee__issue__id"),
),
distinct=True,
output_field=IntegerField(),
),
distinct=True,
),
user_id=F("member_id"),
)
.values_list("member", flat=True)
.order_by("is_current_user", "-num_activities")
.values("user_id", "active_issue_count")
.order_by("-active_issue_count")
.distinct()
)
search = request.query_params.get("search", None)
if search:
project_members_with_activities = (
project_members_with_activities.filter(
Q(member__display_name__icontains=search)
| Q(member__first_name__icontains=search)
| Q(member__last_name__icontains=search)
)
)
return self.paginate(
request=request,
queryset=project_members_with_activities,
controller=lambda qs: self.get_results_controller(qs, slug),
return Response(
(project_members_with_activities),
status=status.HTTP_200_OK,
)
class DashboardEndpoint(BaseAPIView):
def get_results_controller(self, project_members_with_activities, slug):
user_active_issue_counts = (
User.objects.filter(
id__in=project_members_with_activities,
)
.annotate(
active_issue_count=Count(
Case(
When(
issue_assignee__issue__state__group__in=[
"unstarted",
"started",
],
issue_assignee__issue__workspace__slug=slug,
issue_assignee__issue__project__project_projectmember__is_active=True,
then=F("issue_assignee__issue__id"),
),
output_field=IntegerField(),
),
distinct=True,
)
)
.values("active_issue_count", user_id=F("id"))
)
# Create a dictionary to store the active issue counts by user ID
active_issue_counts_dict = {
user["user_id"]: user["active_issue_count"]
for user in user_active_issue_counts
}
# Preserve the sequence of project members with activities
paginated_results = [
{
"user_id": member_id,
"active_issue_count": active_issue_counts_dict.get(
member_id, 0
),
}
for member_id in project_members_with_activities
]
return paginated_results
def create(self, request, slug):
serializer = DashboardSerializer(data=request.data)
if serializer.is_valid():


@@ -1,5 +1,9 @@
import random
import string
import json
# Django imports
from django.utils import timezone
# Third party imports
from rest_framework.response import Response
@@ -7,7 +11,11 @@ from rest_framework import status
# Module imports
from ..base import BaseViewSet, BaseAPIView
from plane.app.permissions import ProjectEntityPermission
from plane.app.permissions import (
ProjectEntityPermission,
allow_permission,
ROLE,
)
from plane.db.models import Project, Estimate, EstimatePoint, Issue
from plane.app.serializers import (
EstimateSerializer,
@@ -15,6 +23,7 @@ from plane.app.serializers import (
EstimateReadSerializer,
)
from plane.utils.cache import invalidate_cache
from plane.bgtasks.issue_activities_task import issue_activity
def generate_random_name(length=10):
@@ -23,10 +32,13 @@ def generate_random_name(length=10):
class ProjectEstimatePointEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
@allow_permission(
[
ROLE.ADMIN,
ROLE.MEMBER,
]
)
def get(self, request, slug, project_id):
project = Project.objects.get(workspace__slug=slug, pk=project_id)
if project.estimate_id is not None:
@@ -189,10 +201,8 @@ class BulkEstimatePointEndpoint(BaseViewSet):
class EstimatePointEndpoint(BaseViewSet):
permission_classes = [
ProjectEntityPermission,
]
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def create(self, request, slug, project_id, estimate_id):
# TODO: add a key validation if the same key already exists
if not request.data.get("key") or not request.data.get("value"):
@@ -211,6 +221,7 @@ class EstimatePointEndpoint(BaseViewSet):
serializer = EstimatePointSerializer(estimate_point).data
return Response(serializer, status=status.HTTP_200_OK)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def partial_update(
self, request, slug, project_id, estimate_id, estimate_point_id
):
@@ -231,6 +242,7 @@ class EstimatePointEndpoint(BaseViewSet):
serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def destroy(
self, request, slug, project_id, estimate_id, estimate_point_id
):
@@ -242,11 +254,66 @@ class EstimatePointEndpoint(BaseViewSet):
)
# update all the issues with the new estimate
if new_estimate_id:
_ = Issue.objects.filter(
issues = Issue.objects.filter(
project_id=project_id,
workspace__slug=slug,
estimate_point_id=estimate_point_id,
).update(estimate_point_id=new_estimate_id)
)
for issue in issues:
issue_activity.delay(
type="issue.activity.updated",
requested_data=json.dumps(
{
"estimate_point": (
str(new_estimate_id)
if new_estimate_id
else None
),
}
),
actor_id=str(request.user.id),
issue_id=issue.id,
project_id=str(project_id),
current_instance=json.dumps(
{
"estimate_point": (
str(issue.estimate_point_id)
if issue.estimate_point_id
else None
),
}
),
epoch=int(timezone.now().timestamp()),
)
issues.update(estimate_point_id=new_estimate_id)
else:
issues = Issue.objects.filter(
project_id=project_id,
workspace__slug=slug,
estimate_point_id=estimate_point_id,
)
for issue in issues:
issue_activity.delay(
type="issue.activity.updated",
requested_data=json.dumps(
{
"estimate_point": None,
}
),
actor_id=str(request.user.id),
issue_id=issue.id,
project_id=str(project_id),
current_instance=json.dumps(
{
"estimate_point": (
str(issue.estimate_point_id)
if issue.estimate_point_id
else None
),
}
),
epoch=int(timezone.now().timestamp()),
)
# delete the estimate point
old_estimate_point = EstimatePoint.objects.filter(


@@ -2,7 +2,7 @@
from rest_framework import status
from rest_framework.response import Response
from plane.app.permissions import WorkSpaceAdminPermission
from plane.app.permissions import allow_permission, ROLE
from plane.app.serializers import ExporterHistorySerializer
from plane.bgtasks.export_task import issue_export_task
from plane.db.models import ExporterHistory, Project, Workspace
@@ -12,12 +12,10 @@ from .. import BaseAPIView
class ExportIssuesEndpoint(BaseAPIView):
permission_classes = [
WorkSpaceAdminPermission,
]
model = ExporterHistory
serializer_class = ExporterHistorySerializer
@allow_permission(allowed_roles=[ROLE.ADMIN, ROLE.MEMBER], level="WORKSPACE")
def post(self, request, slug):
# Get the workspace
workspace = Workspace.objects.get(slug=slug)
@@ -64,6 +62,9 @@ class ExportIssuesEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST,
)
@allow_permission(
allowed_roles=[ROLE.ADMIN, ROLE.MEMBER], level="WORKSPACE"
)
def get(self, request, slug):
exporter_history = ExporterHistory.objects.filter(
workspace__slug=slug,


@@ -11,7 +11,7 @@ from rest_framework import status
# Module imports
from ..base import BaseAPIView
from plane.app.permissions import ProjectEntityPermission, WorkspaceEntityPermission
from plane.app.permissions import allow_permission, ROLE
from plane.db.models import Workspace, Project
from plane.app.serializers import (
ProjectLiteSerializer,
@@ -21,10 +21,8 @@ from plane.license.utils.instance_value import get_configuration_value
class GPTIntegrationEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def post(self, request, slug, project_id):
OPENAI_API_KEY, GPT_ENGINE = get_configuration_value(
[
@@ -84,10 +82,10 @@ class GPTIntegrationEndpoint(BaseAPIView):
class WorkspaceGPTIntegrationEndpoint(BaseAPIView):
permission_classes = [
WorkspaceEntityPermission,
]
@allow_permission(
allowed_roles=[ROLE.ADMIN, ROLE.MEMBER], level="WORKSPACE"
)
def post(self, request, slug):
OPENAI_API_KEY, GPT_ENGINE = get_configuration_value(
[


@@ -3,7 +3,7 @@ import json
# Django import
from django.utils import timezone
from django.db.models import Q, Count, OuterRef, Func, F, Prefetch
from django.db.models import Q, Count, OuterRef, Func, F, Prefetch, Subquery
from django.core.serializers.json import DjangoJSONEncoder
from django.contrib.postgres.aggregates import ArrayAgg
from django.contrib.postgres.fields import ArrayField
@@ -16,16 +16,17 @@ from rest_framework.response import Response
# Module imports
from ..base import BaseViewSet
from plane.app.permissions import ProjectBasePermission, ProjectLitePermission
from plane.app.permissions import allow_permission, ROLE
from plane.db.models import (
Inbox,
InboxIssue,
Issue,
State,
IssueLink,
IssueAttachment,
FileAsset,
Project,
ProjectMember,
CycleIssue,
)
from plane.app.serializers import (
IssueCreateSerializer,
@@ -35,13 +36,10 @@ from plane.app.serializers import (
InboxIssueDetailSerializer,
)
from plane.utils.issue_filters import issue_filters
from plane.bgtasks.issue_activites_task import issue_activity
from plane.bgtasks.issue_activities_task import issue_activity
class InboxViewSet(BaseViewSet):
permission_classes = [
ProjectBasePermission,
]
serializer_class = InboxSerializer
model = Inbox
@@ -63,6 +61,7 @@ class InboxViewSet(BaseViewSet):
.select_related("workspace", "project")
)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def list(self, request, slug, project_id):
inbox = self.get_queryset().first()
return Response(
@@ -70,9 +69,11 @@ class InboxViewSet(BaseViewSet):
status=status.HTTP_200_OK,
)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def perform_create(self, serializer):
serializer.save(project_id=self.kwargs.get("project_id"))
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def destroy(self, request, slug, project_id, pk):
inbox = Inbox.objects.filter(
workspace__slug=slug, project_id=project_id, pk=pk
@@ -88,9 +89,6 @@ class InboxViewSet(BaseViewSet):
class InboxIssueViewSet(BaseViewSet):
permission_classes = [
ProjectLitePermission,
]
serializer_class = InboxIssueSerializer
model = InboxIssue
@@ -115,7 +113,13 @@ class InboxIssueViewSet(BaseViewSet):
),
)
)
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(
cycle_id=Subquery(
CycleIssue.objects.filter(
issue=OuterRef("id"), deleted_at__isnull=True
).values("cycle_id")[:1]
)
)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@@ -123,8 +127,9 @@ class InboxIssueViewSet(BaseViewSet):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -143,7 +148,10 @@ class InboxIssueViewSet(BaseViewSet):
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
filter=Q(
~Q(labels__id__isnull=True)
& Q(label_issue__deleted_at__isnull=True),
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -151,8 +159,11 @@ class InboxIssueViewSet(BaseViewSet):
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True),
filter=Q(
~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True)
& Q(issue_assignee__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -160,18 +171,23 @@ class InboxIssueViewSet(BaseViewSet):
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=~Q(issue_module__module_id__isnull=True)
& Q(issue_module__module__archived_at__isnull=True),
filter=Q(
~Q(issue_module__module_id__isnull=True)
& Q(issue_module__module__archived_at__isnull=True)
& Q(issue_module__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
)
).distinct()
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def list(self, request, slug, project_id):
inbox_id = Inbox.objects.filter(
workspace__slug=slug, project_id=project_id
).first()
project = Project.objects.get(pk=project_id)
filters = issue_filters(request.GET, "GET", "issue__")
inbox_issue = (
InboxIssue.objects.filter(
@@ -186,7 +202,10 @@ class InboxIssueViewSet(BaseViewSet):
ArrayAgg(
"issue__labels__id",
distinct=True,
filter=~Q(issue__labels__id__isnull=True),
filter=Q(
~Q(issue__labels__id__isnull=True)
& Q(issue__label_issue__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
)
@@ -201,6 +220,17 @@ class InboxIssueViewSet(BaseViewSet):
if inbox_status:
inbox_issue = inbox_issue.filter(status__in=inbox_status)
if (
ProjectMember.objects.filter(
workspace__slug=slug,
project_id=project_id,
member=request.user,
role=5,
is_active=True,
).exists()
and not project.guest_view_all_features
):
inbox_issue = inbox_issue.filter(created_by=request.user)
return self.paginate(
request=request,
queryset=(inbox_issue),
@@ -210,6 +240,7 @@ class InboxIssueViewSet(BaseViewSet):
).data,
)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def create(self, request, slug, project_id):
if not request.data.get("issue", {}).get("name", False):
return Response(
@@ -286,7 +317,12 @@ class InboxIssueViewSet(BaseViewSet):
ArrayAgg(
"issue__labels__id",
distinct=True,
filter=~Q(issue__labels__id__isnull=True),
filter=Q(
~Q(issue__labels__id__isnull=True)
& Q(
issue__label_issue__deleted_at__isnull=True
)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -294,7 +330,10 @@ class InboxIssueViewSet(BaseViewSet):
ArrayAgg(
"issue__assignees__id",
distinct=True,
filter=~Q(issue__assignees__id__isnull=True),
filter=~Q(issue__assignees__id__isnull=True)
& Q(
issue__assignees__member_project__is_active=True
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -312,12 +351,13 @@ class InboxIssueViewSet(BaseViewSet):
serializer.errors, status=status.HTTP_400_BAD_REQUEST
)
def partial_update(self, request, slug, project_id, issue_id):
@allow_permission(allowed_roles=[ROLE.ADMIN], creator=True, model=Issue)
def partial_update(self, request, slug, project_id, pk):
inbox_id = Inbox.objects.filter(
workspace__slug=slug, project_id=project_id
).first()
inbox_issue = InboxIssue.objects.get(
issue_id=issue_id,
issue_id=pk,
workspace__slug=slug,
project_id=project_id,
inbox_id=inbox_id,
@@ -330,7 +370,7 @@ class InboxIssueViewSet(BaseViewSet):
is_active=True,
)
# Only project members admins and created_by users can access this endpoint
if project_member.role <= 10 and str(inbox_issue.created_by_id) != str(
if project_member.role <= 5 and str(inbox_issue.created_by_id) != str(
request.user.id
):
return Response(
@@ -346,7 +386,10 @@ class InboxIssueViewSet(BaseViewSet):
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
filter=Q(
~Q(labels__id__isnull=True)
& Q(label_issue__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -354,7 +397,10 @@ class InboxIssueViewSet(BaseViewSet):
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True),
filter=Q(
~Q(assignees__id__isnull=True)
& Q(issue_assignee__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -363,9 +409,8 @@ class InboxIssueViewSet(BaseViewSet):
workspace__slug=slug,
project_id=project_id,
)
# Only allow guests and viewers to edit name and description
if project_member.role <= 10:
# viewers and guests since only viewers and guests
# Only allow guests to edit name and description
if project_member.role <= 5:
issue_data = {
"name": issue_data.get("name", issue.name),
"description_html": issue_data.get(
@@ -407,7 +452,7 @@ class InboxIssueViewSet(BaseViewSet):
)
# Only project admins and members can edit inbox issue attributes
if project_member.role > 10:
if project_member.role > 15:
serializer = InboxIssueSerializer(
inbox_issue, data=request.data, partial=True
)
@@ -458,7 +503,7 @@ class InboxIssueViewSet(BaseViewSet):
request.data, cls=DjangoJSONEncoder
),
actor_id=str(request.user.id),
issue_id=str(issue_id),
issue_id=str(pk),
project_id=str(project_id),
current_instance=current_instance,
epoch=int(timezone.now().timestamp()),
@@ -478,7 +523,12 @@ class InboxIssueViewSet(BaseViewSet):
ArrayAgg(
"issue__labels__id",
distinct=True,
filter=~Q(issue__labels__id__isnull=True),
filter=Q(
~Q(issue__labels__id__isnull=True)
& Q(
issue__label_issue__deleted_at__isnull=True
)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -486,14 +536,19 @@ class InboxIssueViewSet(BaseViewSet):
ArrayAgg(
"issue__assignees__id",
distinct=True,
filter=~Q(issue__assignees__id__isnull=True),
filter=Q(
~Q(issue__assignees__id__isnull=True)
& Q(
issue__issue_assignee__deleted_at__isnull=True
)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
)
.get(
inbox_id=inbox_id.id,
issue_id=issue_id,
issue_id=pk,
project_id=project_id,
)
)
@@ -506,10 +561,20 @@ class InboxIssueViewSet(BaseViewSet):
serializer = InboxIssueDetailSerializer(inbox_issue).data
return Response(serializer, status=status.HTTP_200_OK)
def retrieve(self, request, slug, project_id, issue_id):
@allow_permission(
allowed_roles=[
ROLE.ADMIN,
ROLE.MEMBER,
ROLE.GUEST,
],
creator=True,
model=Issue,
)
def retrieve(self, request, slug, project_id, pk):
inbox_id = Inbox.objects.filter(
workspace__slug=slug, project_id=project_id
).first()
project = Project.objects.get(pk=project_id)
inbox_issue = (
InboxIssue.objects.select_related("issue")
.prefetch_related(
@@ -521,7 +586,10 @@ class InboxIssueViewSet(BaseViewSet):
ArrayAgg(
"issue__labels__id",
distinct=True,
filter=~Q(issue__labels__id__isnull=True),
filter=Q(
~Q(issue__labels__id__isnull=True)
& Q(issue__label_issue__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -529,27 +597,44 @@ class InboxIssueViewSet(BaseViewSet):
ArrayAgg(
"issue__assignees__id",
distinct=True,
filter=~Q(issue__assignees__id__isnull=True),
filter=Q(
~Q(issue__assignees__id__isnull=True)
& Q(issue__issue_assignee__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
)
.get(
inbox_id=inbox_id.id, issue_id=issue_id, project_id=project_id
)
.get(inbox_id=inbox_id.id, issue_id=pk, project_id=project_id)
)
if (
ProjectMember.objects.filter(
workspace__slug=slug,
project_id=project_id,
member=request.user,
role=5,
is_active=True,
).exists()
and not project.guest_view_all_features
and not inbox_issue.created_by == request.user
):
return Response(
{"error": "You are not allowed to view this issue"},
status=status.HTTP_400_BAD_REQUEST,
)
issue = InboxIssueDetailSerializer(inbox_issue).data
return Response(
issue,
status=status.HTTP_200_OK,
)
def destroy(self, request, slug, project_id, issue_id):
@allow_permission(allowed_roles=[ROLE.ADMIN], creator=True, model=Issue)
def destroy(self, request, slug, project_id, pk):
inbox_id = Inbox.objects.filter(
workspace__slug=slug, project_id=project_id
).first()
inbox_issue = InboxIssue.objects.get(
issue_id=issue_id,
issue_id=pk,
workspace__slug=slug,
project_id=project_id,
inbox_id=inbox_id,
@@ -559,21 +644,8 @@ class InboxIssueViewSet(BaseViewSet):
if inbox_issue.status in [-2, -1, 0, 2]:
# Delete the issue also
issue = Issue.objects.filter(
workspace__slug=slug, project_id=project_id, pk=issue_id
workspace__slug=slug, project_id=project_id, pk=pk
).first()
if issue.created_by_id != request.user.id and (
not ProjectMember.objects.filter(
workspace__slug=slug,
member=request.user,
role=20,
project_id=project_id,
is_active=True,
).exists()
):
return Response(
{"error": "Only admin or creator can delete the issue"},
status=status.HTTP_403_FORBIDDEN,
)
issue.delete()
inbox_issue.delete()


@@ -19,7 +19,11 @@ from plane.app.serializers import (
IssueActivitySerializer,
IssueCommentSerializer,
)
from plane.app.permissions import ProjectEntityPermission
from plane.app.permissions import (
ProjectEntityPermission,
allow_permission,
ROLE,
)
from plane.db.models import (
IssueActivity,
IssueComment,
@@ -33,6 +37,13 @@ class IssueActivityEndpoint(BaseAPIView):
]
@method_decorator(gzip_page)
@allow_permission(
[
ROLE.ADMIN,
ROLE.MEMBER,
ROLE.GUEST,
]
)
def get(self, request, slug, project_id, issue_id):
filters = {}
if request.GET.get("created_at__gt", None) is not None:


@@ -3,14 +3,7 @@ import json
# Django imports
from django.core.serializers.json import DjangoJSONEncoder
from django.db.models import (
F,
Func,
OuterRef,
Q,
Prefetch,
Exists,
)
from django.db.models import F, Func, OuterRef, Q, Prefetch, Exists, Subquery
from django.utils import timezone
from django.utils.decorators import method_decorator
from django.views.decorators.gzip import gzip_page
@@ -25,15 +18,16 @@ from plane.app.permissions import (
from plane.app.serializers import (
IssueFlatSerializer,
IssueSerializer,
IssueDetailSerializer
IssueDetailSerializer,
)
from plane.bgtasks.issue_activites_task import issue_activity
from plane.bgtasks.issue_activities_task import issue_activity
from plane.db.models import (
Issue,
IssueAttachment,
FileAsset,
IssueLink,
IssueSubscriber,
IssueReaction,
CycleIssue
)
from plane.utils.grouper import (
issue_group_values,
@@ -46,15 +40,14 @@ from plane.utils.paginator import (
GroupedOffsetPaginator,
SubGroupedOffsetPaginator,
)
from plane.app.permissions import allow_permission, ROLE
from plane.utils.error_codes import ERROR_CODES
# Module imports
from .. import BaseViewSet, BaseAPIView
class IssueArchiveViewSet(BaseViewSet):
permission_classes = [
ProjectEntityPermission,
]
serializer_class = IssueFlatSerializer
model = Issue
@@ -72,7 +65,13 @@ class IssueArchiveViewSet(BaseViewSet):
.filter(workspace__slug=self.kwargs.get("slug"))
.select_related("workspace", "project", "state", "parent")
.prefetch_related("assignees", "labels", "issue_module__module")
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(
cycle_id=Subquery(
CycleIssue.objects.filter(
issue=OuterRef("id"), deleted_at__isnull=True
).values("cycle_id")[:1]
)
)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@@ -80,8 +79,9 @@ class IssueArchiveViewSet(BaseViewSet):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -98,6 +98,12 @@ class IssueArchiveViewSet(BaseViewSet):
)
@method_decorator(gzip_page)
@allow_permission(
[
ROLE.ADMIN,
ROLE.MEMBER,
]
)
def list(self, request, slug, project_id):
filters = issue_filters(request.query_params, "GET")
show_sub_issues = request.GET.get("show_sub_issues", "true")
@@ -213,6 +219,12 @@ class IssueArchiveViewSet(BaseViewSet):
),
)
@allow_permission(
[
ROLE.ADMIN,
ROLE.MEMBER,
]
)
def retrieve(self, request, slug, project_id, pk=None):
issue = (
self.get_queryset()
@@ -225,12 +237,6 @@ class IssueArchiveViewSet(BaseViewSet):
),
)
)
.prefetch_related(
Prefetch(
"issue_attachment",
queryset=IssueAttachment.objects.select_related("issue"),
)
)
.prefetch_related(
Prefetch(
"issue_link",
@@ -256,6 +262,7 @@ class IssueArchiveViewSet(BaseViewSet):
serializer = IssueDetailSerializer(issue, expand=self.expand)
return Response(serializer.data, status=status.HTTP_200_OK)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def archive(self, request, slug, project_id, pk=None):
issue = Issue.issue_objects.get(
workspace__slug=slug,
@@ -294,6 +301,7 @@ class IssueArchiveViewSet(BaseViewSet):
{"archived_at": str(issue.archived_at)}, status=status.HTTP_200_OK
)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def unarchive(self, request, slug, project_id, pk=None):
issue = Issue.objects.get(
workspace__slug=slug,
@@ -325,6 +333,7 @@ class BulkArchiveIssuesEndpoint(BaseAPIView):
ProjectEntityPermission,
]
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def post(self, request, slug, project_id):
issue_ids = request.data.get("issue_ids", [])
@@ -342,8 +351,10 @@ class BulkArchiveIssuesEndpoint(BaseAPIView):
if issue.state.group not in ["completed", "cancelled"]:
return Response(
{
"error_code": 4091,
"error_message": "INVALID_ARCHIVE_STATE_GROUP"
"error_code": ERROR_CODES[
"INVALID_ARCHIVE_STATE_GROUP"
],
"error_message": "INVALID_ARCHIVE_STATE_GROUP",
},
status=status.HTTP_400_BAD_REQUEST,
)


@@ -1,9 +1,12 @@
# Python imports
import json
import uuid
# Django imports
from django.utils import timezone
from django.core.serializers.json import DjangoJSONEncoder
from django.conf import settings
from django.http import HttpResponseRedirect
# Third Party imports
from rest_framework.response import Response
@@ -13,23 +16,29 @@ from rest_framework.parsers import MultiPartParser, FormParser
# Module imports
from .. import BaseAPIView
from plane.app.serializers import IssueAttachmentSerializer
from plane.app.permissions import ProjectEntityPermission
from plane.db.models import IssueAttachment, ProjectMember
from plane.bgtasks.issue_activites_task import issue_activity
from plane.db.models import FileAsset, Workspace
from plane.bgtasks.issue_activities_task import issue_activity
from plane.app.permissions import allow_permission, ROLE
from plane.settings.storage import S3Storage
from plane.bgtasks.storage_metadata_task import get_asset_object_metadata
class IssueAttachmentEndpoint(BaseAPIView):
serializer_class = IssueAttachmentSerializer
permission_classes = [
ProjectEntityPermission,
]
model = IssueAttachment
model = FileAsset
parser_classes = (MultiPartParser, FormParser)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def post(self, request, slug, project_id, issue_id):
serializer = IssueAttachmentSerializer(data=request.data)
workspace = Workspace.objects.get(slug=slug)
if serializer.is_valid():
serializer.save(project_id=project_id, issue_id=issue_id)
serializer.save(
project_id=project_id,
issue_id=issue_id,
workspace_id=workspace.id,
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
issue_activity.delay(
type="attachment.activity.created",
requested_data=None,
@@ -47,21 +56,9 @@ class IssueAttachmentEndpoint(BaseAPIView):
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
@allow_permission([ROLE.ADMIN], creator=True, model=FileAsset)
def delete(self, request, slug, project_id, issue_id, pk):
issue_attachment = IssueAttachment.objects.get(pk=pk)
if issue_attachment.created_by_id != request.user.id and (
not ProjectMember.objects.filter(
workspace__slug=slug,
member=request.user,
role=20,
project_id=project_id,
is_active=True,
).exists()
):
return Response(
{"error": "Only admin or creator can delete the attachment"},
status=status.HTTP_403_FORBIDDEN,
)
issue_attachment = FileAsset.objects.get(pk=pk)
issue_attachment.asset.delete(save=False)
issue_attachment.delete()
issue_activity.delay(
@@ -78,9 +75,187 @@ class IssueAttachmentEndpoint(BaseAPIView):
return Response(status=status.HTTP_204_NO_CONTENT)
@allow_permission(
[
ROLE.ADMIN,
ROLE.MEMBER,
ROLE.GUEST,
]
)
def get(self, request, slug, project_id, issue_id):
issue_attachments = IssueAttachment.objects.filter(
issue_attachments = FileAsset.objects.filter(
issue_id=issue_id, workspace__slug=slug, project_id=project_id
)
serializer = IssueAttachmentSerializer(issue_attachments, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
class IssueAttachmentV2Endpoint(BaseAPIView):
serializer_class = IssueAttachmentSerializer
model = FileAsset
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def post(self, request, slug, project_id, issue_id):
name = request.data.get("name")
type = request.data.get("type", False)
size = int(request.data.get("size", settings.FILE_SIZE_LIMIT))
if not type or type not in settings.ATTACHMENT_MIME_TYPES:
return Response(
{
"error": "Invalid file type.",
"status": False,
},
status=status.HTTP_400_BAD_REQUEST,
)
# Get the workspace
workspace = Workspace.objects.get(slug=slug)
# asset key
asset_key = f"{workspace.id}/{uuid.uuid4().hex}-{name}"
# Get the size limit
size_limit = min(size, settings.FILE_SIZE_LIMIT)
# Create a File Asset
asset = FileAsset.objects.create(
attributes={
"name": name,
"type": type,
"size": size_limit,
},
asset=asset_key,
size=size_limit,
workspace_id=workspace.id,
created_by=request.user,
issue_id=issue_id,
project_id=project_id,
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
# Get the presigned URL
storage = S3Storage(request=request)
# Generate a presigned URL to share an S3 object
presigned_url = storage.generate_presigned_post(
object_name=asset_key,
file_type=type,
file_size=size_limit,
)
# Return the presigned URL
return Response(
{
"upload_data": presigned_url,
"asset_id": str(asset.id),
"attachment": IssueAttachmentSerializer(asset).data,
"asset_url": asset.asset_url,
},
status=status.HTTP_200_OK,
)
@allow_permission([ROLE.ADMIN], creator=True, model=FileAsset)
def delete(self, request, slug, project_id, issue_id, pk):
issue_attachment = FileAsset.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id
)
issue_attachment.is_deleted = True
issue_attachment.deleted_at = timezone.now()
issue_attachment.save()
issue_activity.delay(
type="attachment.activity.deleted",
requested_data=None,
actor_id=str(self.request.user.id),
issue_id=str(issue_id),
project_id=str(project_id),
current_instance=None,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
)
return Response(status=status.HTTP_204_NO_CONTENT)
@allow_permission(
[
ROLE.ADMIN,
ROLE.MEMBER,
ROLE.GUEST,
]
)
def get(self, request, slug, project_id, issue_id, pk=None):
if pk:
# Get the asset
asset = FileAsset.objects.get(
id=pk, workspace__slug=slug, project_id=project_id
)
# Check if the asset is uploaded
if not asset.is_uploaded:
return Response(
{
"error": "The asset is not uploaded.",
"status": False,
},
status=status.HTTP_400_BAD_REQUEST,
)
storage = S3Storage(request=request)
presigned_url = storage.generate_presigned_url(
object_name=asset.asset.name,
disposition="attachment",
filename=asset.attributes.get("name"),
)
return HttpResponseRedirect(presigned_url)
# Get all the attachments
issue_attachments = FileAsset.objects.filter(
issue_id=issue_id,
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
workspace__slug=slug,
project_id=project_id,
is_uploaded=True,
)
# Serialize the attachments
serializer = IssueAttachmentSerializer(issue_attachments, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
@allow_permission(
[
ROLE.ADMIN,
ROLE.MEMBER,
ROLE.GUEST,
]
)
def patch(self, request, slug, project_id, issue_id, pk):
issue_attachment = FileAsset.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id
)
serializer = IssueAttachmentSerializer(issue_attachment)
# Send this activity only if the attachment is not uploaded before
if not issue_attachment.is_uploaded:
issue_activity.delay(
type="attachment.activity.created",
requested_data=None,
actor_id=str(self.request.user.id),
issue_id=str(self.kwargs.get("issue_id", None)),
project_id=str(self.kwargs.get("project_id", None)),
current_instance=json.dumps(
serializer.data,
cls=DjangoJSONEncoder,
),
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
)
# Update the attachment
issue_attachment.is_uploaded = True
# Get the storage metadata
if not issue_attachment.storage_metadata:
get_asset_object_metadata.delay(str(issue_attachment.id))
issue_attachment.save()
return Response(status=status.HTTP_204_NO_CONTENT)
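For downloads, the GET handler with a pk responds with a redirect to a short-lived presigned S3 URL generated with disposition="attachment", so a client only needs to follow redirects. A minimal sketch; the route and auth scheme are assumptions based on the view signature (slug, project_id, issue_id, pk):

import requests

res = requests.get(
    "https://plane.example.com/api/workspaces/acme/projects/42/issues/99/attachments/7/",
    headers={"Authorization": "Bearer <token>"},  # assumed auth scheme
    allow_redirects=True,  # follow the redirect to the presigned S3 URL
)
res.raise_for_status()
with open("attachment.bin", "wb") as f:
    f.write(res.content)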


@@ -14,6 +14,7 @@ from django.db.models import (
Q,
UUIDField,
Value,
Subquery,
)
from django.db.models.functions import Coalesce
from django.utils import timezone
@@ -25,26 +26,24 @@ from rest_framework import status
from rest_framework.response import Response
# Module imports
from plane.app.permissions import (
ProjectEntityPermission,
ProjectLitePermission,
)
from plane.app.permissions import allow_permission, ROLE
from plane.app.serializers import (
IssueCreateSerializer,
IssueDetailSerializer,
IssueUserPropertySerializer,
IssueSerializer,
)
from plane.bgtasks.issue_activites_task import issue_activity
from plane.bgtasks.issue_activities_task import issue_activity
from plane.db.models import (
Issue,
IssueAttachment,
FileAsset,
IssueLink,
IssueUserProperty,
IssueReaction,
IssueSubscriber,
Project,
ProjectMember,
CycleIssue,
)
from plane.utils.grouper import (
issue_group_values,
@@ -59,15 +58,13 @@ from plane.utils.paginator import (
)
from .. import BaseAPIView, BaseViewSet
from plane.utils.user_timezone_converter import user_timezone_converter
# Module imports
from plane.bgtasks.recent_visited_task import recent_visited_task
from plane.utils.global_paginator import paginate
from plane.bgtasks.webhook_task import model_activity
class IssueListEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def get(self, request, slug, project_id):
issue_ids = request.GET.get("issues", False)
@@ -88,7 +85,13 @@ class IssueListEndpoint(BaseAPIView):
.filter(workspace__slug=self.kwargs.get("slug"))
.select_related("workspace", "project", "state", "parent")
.prefetch_related("assignees", "labels", "issue_module__module")
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(
cycle_id=Subquery(
CycleIssue.objects.filter(
issue=OuterRef("id"), deleted_at__isnull=True
).values("cycle_id")[:1]
)
)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@@ -96,8 +99,9 @@ class IssueListEndpoint(BaseAPIView):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -134,6 +138,14 @@ class IssueListEndpoint(BaseAPIView):
sub_group_by=sub_group_by,
)
recent_visited_task.delay(
slug=slug,
project_id=project_id,
entity_name="project",
entity_identifier=project_id,
user_id=request.user.id,
)
if self.fields or self.expand:
issues = IssueSerializer(
queryset, many=True, fields=self.fields, expand=self.expand
@@ -184,9 +196,6 @@ class IssueViewSet(BaseViewSet):
model = Issue
webhook_event = "issue"
permission_classes = [
ProjectEntityPermission,
]
search_fields = [
"name",
@@ -206,7 +215,13 @@ class IssueViewSet(BaseViewSet):
.filter(workspace__slug=self.kwargs.get("slug"))
.select_related("workspace", "project", "state", "parent")
.prefetch_related("assignees", "labels", "issue_module__module")
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(
cycle_id=Subquery(
CycleIssue.objects.filter(
issue=OuterRef("id"), deleted_at__isnull=True
).values("cycle_id")[:1]
)
)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@@ -214,8 +229,9 @@ class IssueViewSet(BaseViewSet):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -232,11 +248,19 @@ class IssueViewSet(BaseViewSet):
).distinct()
@method_decorator(gzip_page)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def list(self, request, slug, project_id):
extra_filters = {}
if request.GET.get("updated_at__gt", None) is not None:
extra_filters = {
"updated_at__gt": request.GET.get("updated_at__gt")
}
project = Project.objects.get(pk=project_id, workspace__slug=slug)
filters = issue_filters(request.query_params, "GET")
order_by_param = request.GET.get("order_by", "-created_at")
issue_queryset = self.get_queryset().filter(**filters)
issue_queryset = self.get_queryset().filter(**filters, **extra_filters)
# Custom ordering for priority and state
# Issue queryset
@@ -256,6 +280,25 @@ class IssueViewSet(BaseViewSet):
sub_group_by=sub_group_by,
)
recent_visited_task.delay(
slug=slug,
project_id=project_id,
entity_name="project",
entity_identifier=project_id,
user_id=request.user.id,
)
if (
ProjectMember.objects.filter(
workspace__slug=slug,
project_id=project_id,
member=request.user,
role=5,
is_active=True,
).exists()
and not project.guest_view_all_features
):
issue_queryset = issue_queryset.filter(created_by=request.user)
if group_by:
if sub_group_by:
if group_by == sub_group_by:
@@ -337,6 +380,7 @@ class IssueViewSet(BaseViewSet):
),
)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER])
def create(self, request, slug, project_id):
project = Project.objects.get(pk=project_id)
@@ -408,10 +452,31 @@ class IssueViewSet(BaseViewSet):
issue = user_timezone_converter(
issue, datetime_fields, request.user.user_timezone
)
# Send the model activity
model_activity.delay(
model_name="issue",
model_id=str(serializer.data["id"]),
requested_data=request.data,
current_instance=None,
actor_id=request.user.id,
slug=slug,
origin=request.META.get("HTTP_ORIGIN"),
)
return Response(issue, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
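
The cycle_id annotation in the querysets above is resolved through a correlated Subquery that skips soft-deleted CycleIssue rows, rather than a plain F("issue_cycle__cycle_id") traversal that could surface a cycle whose link has been soft-deleted. A minimal sketch of the pattern; the helper function is purely illustrative, while the field names follow the diff:

# Sketch: annotate issues with the id of their first non-deleted cycle link.
from django.db.models import OuterRef, Subquery

def annotate_active_cycle(issue_queryset, cycle_issue_model):
    # Correlate on the outer issue id, skip soft-deleted links, and keep at
    # most one cycle_id so the subquery stays scalar.
    active_link = cycle_issue_model.objects.filter(
        issue=OuterRef("id"), deleted_at__isnull=True
    ).values("cycle_id")[:1]
    return issue_queryset.annotate(cycle_id=Subquery(active_link))

Slicing the subquery to a single row matters: an issue with more than one matching cycle link would otherwise make the scalar subquery fail when the annotation is evaluated.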
@allow_permission(
allowed_roles=[
ROLE.ADMIN,
ROLE.MEMBER,
ROLE.GUEST,
],
creator=True,
model=Issue,
)
def retrieve(self, request, slug, project_id, pk=None):
project = Project.objects.get(pk=project_id, workspace__slug=slug)
issue = (
self.get_queryset()
.filter(pk=pk)
@@ -420,7 +485,10 @@ class IssueViewSet(BaseViewSet):
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
filter=Q(
~Q(labels__id__isnull=True)
& Q(label_issue__deleted_at__isnull=True),
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -428,8 +496,11 @@ class IssueViewSet(BaseViewSet):
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True),
filter=Q(
~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True)
& Q(issue_assignee__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -437,8 +508,11 @@ class IssueViewSet(BaseViewSet):
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=~Q(issue_module__module_id__isnull=True)
& Q(issue_module__module__archived_at__isnull=True),
filter=Q(
~Q(issue_module__module_id__isnull=True)
& Q(issue_module__module__archived_at__isnull=True)
& Q(issue_module__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -451,12 +525,6 @@ class IssueViewSet(BaseViewSet):
),
)
)
.prefetch_related(
Prefetch(
"issue_attachment",
queryset=IssueAttachment.objects.select_related("issue"),
)
)
.prefetch_related(
Prefetch(
"issue_link",
@@ -480,9 +548,41 @@ class IssueViewSet(BaseViewSet):
status=status.HTTP_404_NOT_FOUND,
)
"""
If the requesting user is a guest, guest_view_all_features is disabled, and the
issue was not created by them, do not show the issue.
"""
if (
ProjectMember.objects.filter(
workspace__slug=slug,
project_id=project_id,
member=request.user,
role=5,
is_active=True,
).exists()
and not project.guest_view_all_features
and not issue.created_by == request.user
):
return Response(
{"error": "You are not allowed to view this issue"},
status=status.HTTP_400_BAD_REQUEST,
)
recent_visited_task.delay(
slug=slug,
entity_name="issue",
entity_identifier=pk,
user_id=request.user.id,
project_id=project_id,
)
serializer = IssueDetailSerializer(issue, expand=self.expand)
return Response(serializer.data, status=status.HTTP_200_OK)
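
The guard above enforces the rule spelled out in the docstring: a project member with the guest role (role 5) on a project where guest_view_all_features is off may only open issues they created. A hedged sketch of that check as a standalone helper; the helper itself is illustrative and not part of the project's API:

# Sketch: decide whether a user may view an issue under the guest rule.
GUEST_ROLE = 5  # role value the views use for guests

def can_view_issue(user, slug, project, issue, project_member_model):
    is_guest = project_member_model.objects.filter(
        workspace__slug=slug,
        project_id=project.id,
        member=user,
        role=GUEST_ROLE,
        is_active=True,
    ).exists()
    if is_guest and not project.guest_view_all_features:
        # Guests without "view all features" only see issues they created.
        return issue.created_by_id == user.id
    return True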
@allow_permission(
allowed_roles=[ROLE.ADMIN, ROLE.MEMBER], creator=True, model=Issue
)
def partial_update(self, request, slug, project_id, pk=None):
issue = (
self.get_queryset()
@@ -491,7 +591,10 @@ class IssueViewSet(BaseViewSet):
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
filter=Q(
~Q(labels__id__isnull=True)
& Q(label_issue__deleted_at__isnull=True),
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -499,8 +602,11 @@ class IssueViewSet(BaseViewSet):
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True),
filter=Q(
~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True)
& Q(issue_assignee__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -508,7 +614,11 @@ class IssueViewSet(BaseViewSet):
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=~Q(issue_module__module_id__isnull=True),
filter=Q(
~Q(issue_module__module_id__isnull=True)
& Q(issue_module__module__archived_at__isnull=True)
& Q(issue_module__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -544,27 +654,23 @@ class IssueViewSet(BaseViewSet):
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
)
issue = self.get_queryset().filter(pk=pk).first()
model_activity.delay(
model_name="issue",
model_id=str(serializer.data.get("id", None)),
requested_data=request.data,
current_instance=current_instance,
actor_id=request.user.id,
slug=slug,
origin=request.META.get("HTTP_ORIGIN"),
)
return Response(status=status.HTTP_204_NO_CONTENT)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
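
On a successful update the view enqueues model_activity with the pre-update snapshot (current_instance) and the incoming payload, so diffing and webhook delivery happen off the request thread. Purely as an illustration of that hand-off, a hypothetical Celery receiver with the same keyword arguments might look like this; it is not the project's task:

# Hypothetical consumer of the kwargs sent by the view; illustrative only.
import json
from celery import shared_task

@shared_task
def record_model_activity(model_name, model_id, requested_data,
                          current_instance, actor_id, slug, origin=None):
    if isinstance(current_instance, str):
        before = json.loads(current_instance)
    else:
        before = current_instance or {}
    after = requested_data or {}
    # Keep only the fields whose value actually changed.
    changes = {
        field: {"old": before.get(field), "new": value}
        for field, value in after.items()
        if before.get(field) != value
    }
    # ...persist `changes` and/or deliver webhooks for (model_name, model_id)...
    return changes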
@allow_permission([ROLE.ADMIN], creator=True, model=Issue)
def destroy(self, request, slug, project_id, pk=None):
issue = Issue.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
if issue.created_by_id != request.user.id and (
not ProjectMember.objects.filter(
workspace__slug=slug,
member=request.user,
role=20,
project_id=project_id,
is_active=True,
).exists()
):
return Response(
{"error": "Only admin or creator can delete the issue"},
status=status.HTTP_403_FORBIDDEN,
)
issue.delete()
issue_activity.delay(
@@ -582,10 +688,7 @@ class IssueViewSet(BaseViewSet):
class IssueUserDisplayPropertyEndpoint(BaseAPIView):
permission_classes = [
ProjectLitePermission,
]
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def patch(self, request, slug, project_id):
issue_property = IssueUserProperty.objects.get(
user=request.user,
@@ -605,6 +708,13 @@ class IssueUserDisplayPropertyEndpoint(BaseAPIView):
serializer = IssueUserPropertySerializer(issue_property)
return Response(serializer.data, status=status.HTTP_201_CREATED)
@allow_permission(
[
ROLE.ADMIN,
ROLE.MEMBER,
ROLE.GUEST,
]
)
def get(self, request, slug, project_id):
issue_property, _ = IssueUserProperty.objects.get_or_create(
user=request.user, project_id=project_id
@@ -614,23 +724,8 @@ class IssueUserDisplayPropertyEndpoint(BaseAPIView):
class BulkDeleteIssuesEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
@allow_permission([ROLE.ADMIN])
def delete(self, request, slug, project_id):
if ProjectMember.objects.filter(
workspace__slug=slug,
member=request.user,
role__in=[15, 10, 5],
project_id=project_id,
is_active=True,
).exists():
return Response(
{"error": "Only admin can perform this action"},
status=status.HTTP_403_FORBIDDEN,
)
issue_ids = request.data.get("issue_ids", [])
if not len(issue_ids):
@@ -651,3 +746,194 @@ class BulkDeleteIssuesEndpoint(BaseAPIView):
{"message": f"{total_issues} issues were deleted"},
status=status.HTTP_200_OK,
)
class DeletedIssuesListViewSet(BaseAPIView):
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def get(self, request, slug, project_id):
filters = {}
if request.GET.get("updated_at__gt", None) is not None:
filters = {"updated_at__gt": request.GET.get("updated_at__gt")}
deleted_issues = (
Issue.all_objects.filter(
workspace__slug=slug,
project_id=project_id,
)
.filter(Q(archived_at__isnull=False) | Q(deleted_at__isnull=False))
.filter(**filters)
.values_list("id", flat=True)
)
return Response(deleted_issues, status=status.HTTP_200_OK)
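
DeletedIssuesListViewSet reads from Issue.all_objects, which, unlike the default issue_objects manager used elsewhere, appears to include archived and soft-deleted rows; combined with the optional updated_at__gt cutoff, clients can sync deletions incrementally. A generic sketch of such a manager pair, assuming a nullable deleted_at column as in the diff (all names here are placeholders):

# Sketch: default manager hides soft-deleted rows; all_objects exposes them.
from django.db import models

class NotDeletedManager(models.Manager):
    def get_queryset(self):
        return super().get_queryset().filter(deleted_at__isnull=True)

class SoftDeletableModel(models.Model):
    deleted_at = models.DateTimeField(null=True, blank=True)

    objects = NotDeletedManager()   # day-to-day queries skip deleted rows
    all_objects = models.Manager()  # includes soft-deleted rows

    class Meta:
        abstract = True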
class IssuePaginatedViewSet(BaseViewSet):
def get_queryset(self):
workspace_slug = self.kwargs.get("slug")
project_id = self.kwargs.get("project_id")
issue_queryset = Issue.issue_objects.filter(
workspace__slug=workspace_slug, project_id=project_id
)
return (
issue_queryset.select_related(
"workspace", "project", "state", "parent"
)
.prefetch_related("assignees", "labels", "issue_module__module")
.annotate(
cycle_id=Subquery(
CycleIssue.objects.filter(
issue=OuterRef("id"), deleted_at__isnull=True
).values("cycle_id")[:1]
)
)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
).distinct()
def process_paginated_result(self, fields, results, timezone):
paginated_data = results.values(*fields)
# convert datetime fields in the paginated data to the user's timezone
datetime_fields = ["created_at", "updated_at"]
paginated_data = user_timezone_converter(
paginated_data, datetime_fields, timezone
)
return paginated_data
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def list(self, request, slug, project_id):
cursor = request.GET.get("cursor", None)
is_description_required = request.GET.get("description", "false")
updated_at = request.GET.get("updated_at__gt", None)
# required fields
required_fields = [
"id",
"name",
"state_id",
"state__group",
"sort_order",
"completed_at",
"estimate_point",
"priority",
"start_date",
"target_date",
"sequence_id",
"project_id",
"parent_id",
"cycle_id",
"created_at",
"updated_at",
"created_by",
"updated_by",
"is_draft",
"archived_at",
"module_ids",
"label_ids",
"assignee_ids",
"link_count",
"attachment_count",
"sub_issues_count",
]
if str(is_description_required).lower() == "true":
required_fields.append("description_html")
# querying issues
base_queryset = Issue.issue_objects.filter(
workspace__slug=slug, project_id=project_id
)
base_queryset = base_queryset.order_by("updated_at")
queryset = self.get_queryset().order_by("updated_at")
# validation for guest user
project = Project.objects.get(pk=project_id, workspace__slug=slug)
project_member = ProjectMember.objects.filter(
workspace__slug=slug,
project_id=project_id,
member=request.user,
role=5,
is_active=True,
)
if project_member.exists() and not project.guest_view_all_features:
base_queryset = base_queryset.filter(created_by=request.user)
queryset = queryset.filter(created_by=request.user)
# filter issues updated after the updated_at timestamp supplied by the user
if updated_at:
base_queryset = base_queryset.filter(updated_at__gt=updated_at)
queryset = queryset.filter(updated_at__gt=updated_at)
queryset = queryset.annotate(
label_ids=Coalesce(
ArrayAgg(
"labels__id",
distinct=True,
filter=Q(
~Q(labels__id__isnull=True)
& Q(label_issue__deleted_at__isnull=True),
),
),
Value([], output_field=ArrayField(UUIDField())),
),
assignee_ids=Coalesce(
ArrayAgg(
"assignees__id",
distinct=True,
filter=Q(
~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True)
& Q(issue_assignee__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
module_ids=Coalesce(
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=Q(
~Q(issue_module__module_id__isnull=True)
& Q(issue_module__module__archived_at__isnull=True)
& Q(issue_module__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
)
paginated_data = paginate(
base_queryset=base_queryset,
queryset=queryset,
cursor=cursor,
on_result=lambda results: self.process_paginated_result(
required_fields, results, request.user.user_timezone
),
)
return Response(paginated_data, status=status.HTTP_200_OK)
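
Attachment counts in these querysets are computed against FileAsset rows scoped by entity_type rather than the old IssueAttachment table, and each per-row count uses the same correlated COUNT subquery as link_count and sub_issues_count. A small generic sketch of that counting pattern; the helper and its parameters are placeholders:

# Sketch: per-row related-object counts via a correlated subquery, mirroring
# the link_count / attachment_count / sub_issues_count annotations above.
from django.db.models import F, Func, OuterRef

def annotate_related_count(parent_qs, related_model, fk_field, alias, **extra_filters):
    count_qs = (
        related_model.objects.filter(**{fk_field: OuterRef("id")}, **extra_filters)
        .order_by()  # drop default ordering; it is meaningless under COUNT
        .annotate(count=Func(F("id"), function="Count"))
        .values("count")
    )
    # Count is emitted without a GROUP BY, so the subquery collapses to a
    # single scalar row for each outer issue.
    return parent_qs.annotate(**{alias: count_qs})

Called as, for example, annotate_related_count(issues, FileAsset, "issue_id", "attachment_count", entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT), it reproduces the attachment_count annotation used above.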


@@ -1,288 +0,0 @@
# Python imports
import json
from datetime import datetime
# Django imports
from django.utils import timezone
# Third Party imports
from rest_framework.response import Response
from rest_framework import status
# Module imports
from .. import BaseAPIView
from plane.app.permissions import (
ProjectEntityPermission,
)
from plane.db.models import (
Project,
Issue,
IssueLabel,
IssueAssignee,
)
from plane.bgtasks.issue_activites_task import issue_activity
class BulkIssueOperationsEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
def post(self, request, slug, project_id):
issue_ids = request.data.get("issue_ids", [])
if not len(issue_ids):
return Response(
{"error": "Issue IDs are required"},
status=status.HTTP_400_BAD_REQUEST,
)
# Get all the issues
issues = (
Issue.objects.filter(
workspace__slug=slug, project_id=project_id, pk__in=issue_ids
)
.select_related("state")
.prefetch_related("labels", "assignees")
)
# Current epoch
epoch = int(timezone.now().timestamp())
# Project details
project = Project.objects.get(workspace__slug=slug, pk=project_id)
workspace_id = project.workspace_id
# Initialize arrays
bulk_update_issues = []
bulk_issue_activities = []
bulk_update_issue_labels = []
bulk_update_issue_assignees = []
properties = request.data.get("properties", {})
if properties.get("start_date", False) and properties.get("target_date", False):
if (
datetime.strptime(properties.get("start_date"), "%Y-%m-%d").date()
> datetime.strptime(properties.get("target_date"), "%Y-%m-%d").date()
):
return Response(
{
"error_code": 4100,
"error_message": "INVALID_ISSUE_DATES",
},
status=status.HTTP_400_BAD_REQUEST,
)
for issue in issues:
# Priority
if properties.get("priority", False):
bulk_issue_activities.append(
{
"type": "issue.activity.updated",
"requested_data": json.dumps(
{"priority": properties.get("priority")}
),
"current_instance": json.dumps(
{"priority": (issue.priority)}
),
"issue_id": str(issue.id),
"actor_id": str(request.user.id),
"project_id": str(project_id),
"epoch": epoch,
}
)
issue.priority = properties.get("priority")
# State
if properties.get("state_id", False):
bulk_issue_activities.append(
{
"type": "issue.activity.updated",
"requested_data": json.dumps(
{"state": properties.get("state")}
),
"current_instance": json.dumps(
{"state": str(issue.state_id)}
),
"issue_id": str(issue.id),
"actor_id": str(request.user.id),
"project_id": str(project_id),
"epoch": epoch,
}
)
issue.state_id = properties.get("state_id")
# Start date
if properties.get("start_date", False):
if (
issue.target_date
and not properties.get("target_date", False)
and issue.target_date
<= datetime.strptime(
properties.get("start_date"), "%Y-%m-%d"
).date()
):
return Response(
{
"error_code": 4101,
"error_message": "INVALID_ISSUE_START_DATE",
},
status=status.HTTP_400_BAD_REQUEST,
)
bulk_issue_activities.append(
{
"type": "issue.activity.updated",
"requested_data": json.dumps(
{"start_date": properties.get("start_date")}
),
"current_instance": json.dumps(
{"start_date": str(issue.start_date)}
),
"issue_id": str(issue.id),
"actor_id": str(request.user.id),
"project_id": str(project_id),
"epoch": epoch,
}
)
issue.start_date = properties.get("start_date")
# Target date
if properties.get("target_date", False):
if (
issue.start_date
and not properties.get("start_date", False)
and issue.start_date
>= datetime.strptime(
properties.get("target_date"), "%Y-%m-%d"
).date()
):
return Response(
{
"error_code": 4102,
"error_message": "INVALID_ISSUE_TARGET_DATE",
},
status=status.HTTP_400_BAD_REQUEST,
)
bulk_issue_activities.append(
{
"type": "issue.activity.updated",
"requested_data": json.dumps(
{"target_date": properties.get("target_date")}
),
"current_instance": json.dumps(
{"target_date": str(issue.target_date)}
),
"issue_id": str(issue.id),
"actor_id": str(request.user.id),
"project_id": str(project_id),
"epoch": epoch,
}
)
issue.target_date = properties.get("target_date")
bulk_update_issues.append(issue)
# Labels
if properties.get("label_ids", []):
for label_id in properties.get("label_ids", []):
bulk_update_issue_labels.append(
IssueLabel(
issue=issue,
label_id=label_id,
created_by=request.user,
project_id=project_id,
workspace_id=workspace_id,
)
)
bulk_issue_activities.append(
{
"type": "issue.activity.updated",
"requested_data": json.dumps(
{"label_ids": properties.get("label_ids", [])}
),
"current_instance": json.dumps(
{
"label_ids": [
str(label.id)
for label in issue.labels.all()
]
}
),
"issue_id": str(issue.id),
"actor_id": str(request.user.id),
"project_id": str(project_id),
"epoch": epoch,
}
)
# Assignees
if properties.get("assignee_ids", []):
for assignee_id in properties.get(
"assignee_ids", issue.assignees
):
bulk_update_issue_assignees.append(
IssueAssignee(
issue=issue,
assignee_id=assignee_id,
created_by=request.user,
project_id=project_id,
workspace_id=workspace_id,
)
)
bulk_issue_activities.append(
{
"type": "issue.activity.updated",
"requested_data": json.dumps(
{
"assignee_ids": properties.get(
"assignee_ids", []
)
}
),
"current_instance": json.dumps(
{
"assignee_ids": [
str(assignee.id)
for assignee in issue.assignees.all()
]
}
),
"issue_id": str(issue.id),
"actor_id": str(request.user.id),
"project_id": str(project_id),
"epoch": epoch,
}
)
# Bulk update all the objects
Issue.objects.bulk_update(
bulk_update_issues,
[
"priority",
"start_date",
"target_date",
"state",
],
batch_size=100,
)
# Create new labels
IssueLabel.objects.bulk_create(
bulk_update_issue_labels,
ignore_conflicts=True,
batch_size=100,
)
# Create new assignees
IssueAssignee.objects.bulk_create(
bulk_update_issue_assignees,
ignore_conflicts=True,
batch_size=100,
)
# update the issue activity
[
issue_activity.delay(**activity)
for activity in bulk_issue_activities
]
return Response(status=status.HTTP_204_NO_CONTENT)
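
The removed BulkIssueOperationsEndpoint batches its writes: it mutates the fetched Issue instances in memory, flushes them with bulk_update, inserts new label and assignee links with bulk_create(ignore_conflicts=True), and fans out one issue_activity task per recorded change. A condensed sketch of that write pattern for a single property; Issue stands in for any model carrying the field:

# Sketch: apply one property to many issues with a single UPDATE round trip.
def bulk_set_priority(issue_model, issue_ids, priority, project_id, workspace_slug):
    issues = list(
        issue_model.objects.filter(
            workspace__slug=workspace_slug, project_id=project_id, pk__in=issue_ids
        )
    )
    for issue in issues:
        issue.priority = priority
    # One UPDATE per batch instead of one save() per issue.
    issue_model.objects.bulk_update(issues, ["priority"], batch_size=100)
    return issues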


@@ -16,22 +16,21 @@ from plane.app.serializers import (
IssueCommentSerializer,
CommentReactionSerializer,
)
from plane.app.permissions import ProjectLitePermission
from plane.app.permissions import allow_permission, ROLE
from plane.db.models import (
IssueComment,
ProjectMember,
CommentReaction,
Project,
Issue,
)
from plane.bgtasks.issue_activites_task import issue_activity
from plane.bgtasks.issue_activities_task import issue_activity
class IssueCommentViewSet(BaseViewSet):
serializer_class = IssueCommentSerializer
model = IssueComment
webhook_event = "issue_comment"
permission_classes = [
ProjectLitePermission,
]
filterset_fields = [
"issue__id",
@@ -66,7 +65,31 @@ class IssueCommentViewSet(BaseViewSet):
.distinct()
)
@allow_permission(
[
ROLE.ADMIN,
ROLE.MEMBER,
ROLE.GUEST,
]
)
def create(self, request, slug, project_id, issue_id):
project = Project.objects.get(pk=project_id)
issue = Issue.objects.get(pk=issue_id)
if (
ProjectMember.objects.filter(
workspace__slug=slug,
project_id=project_id,
member=request.user,
role=5,
is_active=True,
).exists()
and not project.guest_view_all_features
and not issue.created_by == request.user
):
return Response(
{"error": "You are not allowed to comment on the issue"},
status=status.HTTP_400_BAD_REQUEST,
)
serializer = IssueCommentSerializer(data=request.data)
if serializer.is_valid():
serializer.save(
@@ -90,6 +113,11 @@ class IssueCommentViewSet(BaseViewSet):
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
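
Throughout these files, class-level permission_classes lists give way to a per-action allow_permission decorator that names the roles (and, via creator=True plus a model, the object creator) allowed to call the handler. The decorator's implementation is not part of this diff; as a rough illustration only, a role-gated decorator for DRF view methods could be shaped like this:

# Hypothetical sketch of a role-based decorator for DRF view methods.
from functools import wraps
from rest_framework import status
from rest_framework.response import Response

def allow_roles(allowed_roles, get_role):
    """get_role(request, slug, project_id) should return the caller's role or None."""
    def decorator(view_method):
        @wraps(view_method)
        def wrapped(self, request, slug, project_id, *args, **kwargs):
            role = get_role(request, slug, project_id)
            if role not in allowed_roles:
                return Response(
                    {"error": "You do not have permission to perform this action"},
                    status=status.HTTP_403_FORBIDDEN,
                )
            return view_method(self, request, slug, project_id, *args, **kwargs)
        return wrapped
    return decorator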
@allow_permission(
allowed_roles=[ROLE.ADMIN],
creator=True,
model=IssueComment,
)
def partial_update(self, request, slug, project_id, issue_id, pk):
issue_comment = IssueComment.objects.get(
workspace__slug=slug,
@@ -121,6 +149,9 @@ class IssueCommentViewSet(BaseViewSet):
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
@allow_permission(
allowed_roles=[ROLE.ADMIN], creator=True, model=IssueComment
)
def destroy(self, request, slug, project_id, issue_id, pk):
issue_comment = IssueComment.objects.get(
workspace__slug=slug,
@@ -150,9 +181,6 @@ class IssueCommentViewSet(BaseViewSet):
class CommentReactionViewSet(BaseViewSet):
serializer_class = CommentReactionSerializer
model = CommentReaction
permission_classes = [
ProjectLitePermission,
]
def get_queryset(self):
return (
@@ -170,6 +198,13 @@ class CommentReactionViewSet(BaseViewSet):
.distinct()
)
@allow_permission(
[
ROLE.ADMIN,
ROLE.MEMBER,
ROLE.GUEST,
]
)
def create(self, request, slug, project_id, comment_id):
serializer = CommentReactionSerializer(data=request.data)
if serializer.is_valid():
@@ -192,6 +227,13 @@ class CommentReactionViewSet(BaseViewSet):
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
@allow_permission(
[
ROLE.ADMIN,
ROLE.MEMBER,
ROLE.GUEST,
]
)
def destroy(self, request, slug, project_id, comment_id, reaction_code):
comment_reaction = CommentReaction.objects.get(
workspace__slug=slug,


@@ -1,410 +0,0 @@
# Python imports
import json
# Django imports
from django.contrib.postgres.aggregates import ArrayAgg
from django.contrib.postgres.fields import ArrayField
from django.core.serializers.json import DjangoJSONEncoder
from django.db.models import (
Exists,
F,
Func,
OuterRef,
Prefetch,
Q,
UUIDField,
Value,
)
from django.db.models.functions import Coalesce
from django.utils import timezone
from django.utils.decorators import method_decorator
from django.views.decorators.gzip import gzip_page
# Third Party imports
from rest_framework import status
from rest_framework.response import Response
# Module imports
from plane.app.permissions import ProjectEntityPermission
from plane.app.serializers import (
IssueCreateSerializer,
IssueDetailSerializer,
IssueFlatSerializer,
IssueSerializer,
)
from plane.bgtasks.issue_activites_task import issue_activity
from plane.db.models import (
Issue,
IssueAttachment,
IssueLink,
IssueReaction,
IssueSubscriber,
Project,
ProjectMember,
)
from plane.utils.grouper import (
issue_group_values,
issue_on_results,
issue_queryset_grouper,
)
from plane.utils.issue_filters import issue_filters
from plane.utils.order_queryset import order_issue_queryset
from plane.utils.paginator import (
GroupedOffsetPaginator,
SubGroupedOffsetPaginator,
)
from .. import BaseViewSet
class IssueDraftViewSet(BaseViewSet):
permission_classes = [
ProjectEntityPermission,
]
serializer_class = IssueFlatSerializer
model = Issue
def get_queryset(self):
return (
Issue.objects.filter(project_id=self.kwargs.get("project_id"))
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(is_draft=True)
.filter(deleted_at__isnull=True)
.select_related("workspace", "project", "state", "parent")
.prefetch_related("assignees", "labels", "issue_module__module")
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
).distinct()
@method_decorator(gzip_page)
def list(self, request, slug, project_id):
filters = issue_filters(request.query_params, "GET")
order_by_param = request.GET.get("order_by", "-created_at")
issue_queryset = self.get_queryset().filter(**filters)
# Issue queryset
issue_queryset, order_by_param = order_issue_queryset(
issue_queryset=issue_queryset,
order_by_param=order_by_param,
)
# Group by
group_by = request.GET.get("group_by", False)
sub_group_by = request.GET.get("sub_group_by", False)
# issue queryset
issue_queryset = issue_queryset_grouper(
queryset=issue_queryset,
group_by=group_by,
sub_group_by=sub_group_by,
)
if group_by:
# Check group and sub group value paginate
if sub_group_by:
if group_by == sub_group_by:
return Response(
{
"error": "Group by and sub group by cannot have same parameters"
},
status=status.HTTP_400_BAD_REQUEST,
)
else:
# group and sub group pagination
return self.paginate(
request=request,
order_by=order_by_param,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by,
issues=issues,
sub_group_by=sub_group_by,
),
paginator_cls=SubGroupedOffsetPaginator,
group_by_fields=issue_group_values(
field=group_by,
slug=slug,
project_id=project_id,
filters=filters,
),
sub_group_by_fields=issue_group_values(
field=sub_group_by,
slug=slug,
project_id=project_id,
filters=filters,
),
group_by_field_name=group_by,
sub_group_by_field_name=sub_group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
)
# Group Paginate
else:
# Group paginate
return self.paginate(
request=request,
order_by=order_by_param,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by,
issues=issues,
sub_group_by=sub_group_by,
),
paginator_cls=GroupedOffsetPaginator,
group_by_fields=issue_group_values(
field=group_by,
slug=slug,
project_id=project_id,
filters=filters,
),
group_by_field_name=group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
)
else:
# List Paginate
return self.paginate(
order_by=order_by_param,
request=request,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by, issues=issues, sub_group_by=sub_group_by
),
)
def create(self, request, slug, project_id):
project = Project.objects.get(pk=project_id)
serializer = IssueCreateSerializer(
data=request.data,
context={
"project_id": project_id,
"workspace_id": project.workspace_id,
"default_assignee_id": project.default_assignee_id,
},
)
if serializer.is_valid():
serializer.save(is_draft=True)
# Track the issue
issue_activity.delay(
type="issue_draft.activity.created",
requested_data=json.dumps(
self.request.data, cls=DjangoJSONEncoder
),
actor_id=str(request.user.id),
issue_id=str(serializer.data.get("id", None)),
project_id=str(project_id),
current_instance=None,
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
)
issue = (
issue_queryset_grouper(
queryset=self.get_queryset().filter(
pk=serializer.data["id"]
),
group_by=None,
sub_group_by=None,
)
.values(
"id",
"name",
"state_id",
"sort_order",
"completed_at",
"estimate_point",
"priority",
"start_date",
"target_date",
"sequence_id",
"project_id",
"parent_id",
"cycle_id",
"module_ids",
"label_ids",
"assignee_ids",
"sub_issues_count",
"created_at",
"updated_at",
"created_by",
"updated_by",
"attachment_count",
"link_count",
"is_draft",
"archived_at",
)
.first()
)
return Response(issue, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def partial_update(self, request, slug, project_id, pk):
issue = self.get_queryset().filter(pk=pk).first()
if not issue:
return Response(
{"error": "Issue does not exist"},
status=status.HTTP_404_NOT_FOUND,
)
serializer = IssueCreateSerializer(
issue, data=request.data, partial=True
)
if serializer.is_valid():
serializer.save()
issue_activity.delay(
type="issue_draft.activity.updated",
requested_data=json.dumps(request.data, cls=DjangoJSONEncoder),
actor_id=str(self.request.user.id),
issue_id=str(self.kwargs.get("pk", None)),
project_id=str(self.kwargs.get("project_id", None)),
current_instance=json.dumps(
IssueSerializer(issue).data,
cls=DjangoJSONEncoder,
),
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
)
return Response(status=status.HTTP_204_NO_CONTENT)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def retrieve(self, request, slug, project_id, pk=None):
issue = (
self.get_queryset()
.filter(pk=pk)
.annotate(
label_ids=Coalesce(
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
assignee_ids=Coalesce(
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
module_ids=Coalesce(
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=~Q(issue_module__module_id__isnull=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
)
.prefetch_related(
Prefetch(
"issue_reactions",
queryset=IssueReaction.objects.select_related(
"issue", "actor"
),
)
)
.prefetch_related(
Prefetch(
"issue_attachment",
queryset=IssueAttachment.objects.select_related("issue"),
)
)
.prefetch_related(
Prefetch(
"issue_link",
queryset=IssueLink.objects.select_related("created_by"),
)
)
.annotate(
is_subscribed=Exists(
IssueSubscriber.objects.filter(
workspace__slug=slug,
project_id=project_id,
issue_id=OuterRef("pk"),
subscriber=request.user,
)
)
)
).first()
if not issue:
return Response(
{"error": "The required object does not exist."},
status=status.HTTP_404_NOT_FOUND,
)
serializer = IssueDetailSerializer(issue, expand=self.expand)
return Response(serializer.data, status=status.HTTP_200_OK)
def destroy(self, request, slug, project_id, pk=None):
issue = Issue.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
if issue.created_by_id != request.user.id and (
not ProjectMember.objects.filter(
workspace__slug=slug,
member=request.user,
role=20,
project_id=project_id,
is_active=True,
).exists()
):
return Response(
{"error": "Only admin or creator can delete the issue"},
status=status.HTTP_403_FORBIDDEN,
)
issue.delete()
issue_activity.delay(
type="issue_draft.activity.deleted",
requested_data=json.dumps({"issue_id": str(pk)}),
actor_id=str(request.user.id),
issue_id=str(pk),
project_id=str(project_id),
current_instance={},
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
)
return Response(status=status.HTTP_204_NO_CONTENT)
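
Both the removed draft view above and the rewritten querysets earlier collect label_ids, assignee_ids and module_ids with ArrayAgg wrapped in Coalesce, so an issue with no relations yields an empty list instead of NULL; in the new code the aggregation filter additionally drops soft-deleted through rows (label_issue, issue_assignee, issue_module). A compact sketch of the annotation for labels (PostgreSQL-only; relation names follow the diff):

# Sketch: aggregate related ids into an array, excluding soft-deleted links.
from django.contrib.postgres.aggregates import ArrayAgg
from django.contrib.postgres.fields import ArrayField
from django.db.models import Q, UUIDField, Value
from django.db.models.functions import Coalesce

def with_label_ids(issue_queryset):
    return issue_queryset.annotate(
        label_ids=Coalesce(
            ArrayAgg(
                "labels__id",
                distinct=True,
                filter=Q(labels__id__isnull=False)
                & Q(label_issue__deleted_at__isnull=True),
            ),
            Value([], output_field=ArrayField(UUIDField())),
        )
    )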


@@ -11,9 +11,7 @@ from rest_framework import status
# Module imports
from .. import BaseViewSet, BaseAPIView
from plane.app.serializers import LabelSerializer
from plane.app.permissions import (
ProjectMemberPermission,
)
from plane.app.permissions import allow_permission, ProjectBasePermission, ROLE
from plane.db.models import (
Project,
Label,
@@ -25,7 +23,7 @@ class LabelViewSet(BaseViewSet):
serializer_class = LabelSerializer
model = Label
permission_classes = [
ProjectMemberPermission,
ProjectBasePermission,
]
def get_queryset(self):
@@ -45,6 +43,7 @@ class LabelViewSet(BaseViewSet):
@invalidate_cache(
path="/api/workspaces/:slug/labels/", url_params=True, user=False
)
@allow_permission([ROLE.ADMIN])
def create(self, request, slug, project_id):
try:
serializer = LabelSerializer(data=request.data)
@@ -67,17 +66,20 @@ class LabelViewSet(BaseViewSet):
@invalidate_cache(
path="/api/workspaces/:slug/labels/", url_params=True, user=False
)
@allow_permission([ROLE.ADMIN])
def partial_update(self, request, *args, **kwargs):
return super().partial_update(request, *args, **kwargs)
@invalidate_cache(
path="/api/workspaces/:slug/labels/", url_params=True, user=False
)
@allow_permission([ROLE.ADMIN])
def destroy(self, request, *args, **kwargs):
return super().destroy(request, *args, **kwargs)
class BulkCreateIssueLabelsEndpoint(BaseAPIView):
@allow_permission([ROLE.ADMIN])
def post(self, request, slug, project_id):
label_data = request.data.get("label_data", [])
project = Project.objects.get(pk=project_id)


@@ -14,7 +14,7 @@ from .. import BaseViewSet
from plane.app.serializers import IssueLinkSerializer
from plane.app.permissions import ProjectEntityPermission
from plane.db.models import IssueLink
from plane.bgtasks.issue_activites_task import issue_activity
from plane.bgtasks.issue_activities_task import issue_activity
class IssueLinkViewSet(BaseViewSet):


@@ -12,17 +12,14 @@ from rest_framework import status
# Module imports
from .. import BaseViewSet
from plane.app.serializers import IssueReactionSerializer
from plane.app.permissions import ProjectLitePermission
from plane.app.permissions import allow_permission, ROLE
from plane.db.models import IssueReaction
from plane.bgtasks.issue_activites_task import issue_activity
from plane.bgtasks.issue_activities_task import issue_activity
class IssueReactionViewSet(BaseViewSet):
serializer_class = IssueReactionSerializer
model = IssueReaction
permission_classes = [
ProjectLitePermission,
]
def get_queryset(self):
return (
@@ -40,6 +37,7 @@ class IssueReactionViewSet(BaseViewSet):
.distinct()
)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def create(self, request, slug, project_id, issue_id):
serializer = IssueReactionSerializer(data=request.data)
if serializer.is_valid():
@@ -62,6 +60,7 @@ class IssueReactionViewSet(BaseViewSet):
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
@allow_permission([ROLE.ADMIN, ROLE.MEMBER, ROLE.GUEST])
def destroy(self, request, slug, project_id, issue_id, reaction_code):
issue_reaction = IssueReaction.objects.get(
workspace__slug=slug,

Some files were not shown because too many files have changed in this diff.