Compare commits

...

356 Commits

Author SHA1 Message Date
Aaryan Khandelwal
83298a7a69 chore: generalize page root component 2024-12-18 15:19:48 +05:30
Aaryan Khandelwal
5773c2bde3 chore: gif support for editor (#6219) 2024-12-18 13:17:05 +05:30
M. Palanikannan
e33bae2125 [PE-92] fix: removing readonly collaborative document editor (#6209)
* fix: removing readonly editor

* fix: sync state

* fix: indexeddb sync loader added

* fix: remove node error fixed

* style: page title and checkbox

* chore: removing the syncing logic

* revert: is editable check removed in display message

* fix: editable field optional

* fix: editable removed as optional prop

* fix: extra options import fix

---------

Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
2024-12-18 12:58:18 +05:30
Aaryan Khandelwal
580c4b1930 refactor: remove cn helper function from the editor package (#6217) 2024-12-18 12:22:14 +05:30
Vamsi Krishna
ddd4b51b4e fix: labels empty state for drop down (#6216) 2024-12-17 19:14:10 +05:30
Satish Gandham
ede4aad55b - Do not clear temp files that are locked. (#6214)
- Handle edge cases in sync workspace
2024-12-17 17:46:24 +05:30
Akshita Goyal
1a715c98b2 chore: added common component for project activity (#6212)
* chore: added common component for project activity

* fix: added enum

* fix: added enum for initiatives
2024-12-17 17:02:59 +05:30
Vamsi Krishna
8e6d885731 [WEB-2678] feat: added functionality to add labels directly from dropdown (#6211)
* enhancement: added functionality to add labels directly from dropdown

* fix: fixed import order

* fix: fixed lint errors
2024-12-17 14:29:56 +05:30
Prateek Shourya
4507802aba refactor: enhance workspace and project wrapper modularity (#6207) 2024-12-16 19:01:37 +05:30
Anmol Singh Bhatia
438cc33046 code refactor and improvement (#6203)
* chore: package code refactoring

* chore: component restructuring and refactor

* chore: comment create improvement
2024-12-16 17:24:50 +05:30
Vamsi Krishna
442b0fd7e5 fix: added project sync after transfer issues (#6200) 2024-12-16 15:15:48 +05:30
Dancia
1119b9dc36 Updated README.md (#6182)
* Updated README.md

* minor fixes

* minor fixes
2024-12-16 14:33:08 +05:30
Manish Gupta
47a76f48b4 fix: separated docker compose environment variables (#5575)
* Separated environment variables for specific app containers.

* updated env

* cleanup

* Separated environment variables for specific app containers.

* updated env

* cleanup

---------

Co-authored-by: Akshat Jain <akshatjain9782@gmail.com>
2024-12-16 13:23:33 +05:30
Manish Gupta
a0f03d07f3 chore: Check github releases for upgrades (#6162)
* modified action and install.sh for selfhost

* updated selfhost readme and install.sh

* fixes

* changes suggested by code-rabbit

* chore: updated powered by (#6160)

* improvement: update fetch map during workspace-level module fetch to reduce redundant API calls (#6159)

* fix: remove unwanted states fetching logic to avoid multiple API calls. (#6158)

* chore remove unnecessary CTA (#6161)

* fix: build branch workflow upload artifacts

* fixes

* changes suggested by code-rabbit

* modified action and install.sh for selfhost

* updated selfhost readme and install.sh

* fix: build branch workflow upload artifacts

* fixes

* changes suggested by code-rabbit

---------

Co-authored-by: guru_sainath <gurusainath007@gmail.com>
Co-authored-by: Prateek Shourya <prateekshourya29@gmail.com>
Co-authored-by: rahulramesha <71900764+rahulramesha@users.noreply.github.com>
Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2024-12-16 13:22:23 +05:30
Nikhil
74b2ec03ff feat: add language support (#6205) 2024-12-15 11:04:03 +05:30
guru_sainath
5908998127 [WEB-2854] chore: trigger issue_description_version task on issue create and update (#6202)
* chore: issue description version task trigger from issue create and update

* chore: add default value in prop
2024-12-13 22:30:29 +05:30
guru_sainath
df6a80e7ae chore: add sync jobs for issue_version and issue_description_version tables (#6199)
* chore: added fields in issue_version and profile tables and created a new sticky table

* chore: removed point in issue version

* chore: add imports in init

* chore: added sync jobs for issue_version and issue_description_version

* chore: removed logs

* chore: updated logging

---------

Co-authored-by: sainath <sainath@sainaths-MacBook-Pro.local>
2024-12-13 17:48:55 +05:30
guru_sainath
6ff258ceca chore: Add fields to issue_version and profile tables, and create new sticky table (#6198)
* chore: added fields in issue_version and profile tables and created a new sticky table

* chore: removed point in issue version

* chore: add imports in init

---------

Co-authored-by: sainath <sainath@sainaths-MacBook-Pro.local>
2024-12-13 17:30:25 +05:30
Saurabhkmr98
a8140a5f08 chore: Add logger package for node server side apps (#6188)
* chore: Add logger as a package

* chore: Add logger package for node server side apps

* remove plane logger import in web

* resolve pr reviews and add client logger with readme update

* fix: transformation and added middleware for logging requests

* chore: update readme

* fix: env configurable max file size

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2024-12-13 14:32:56 +05:30
Prateek Shourya
9234f21f26 [WEB-2848] improvement: enhanced components modularity (#6196)
* improvement: enhanced components modularity

* fix: lint errors resolved
2024-12-13 14:26:26 +05:30
Bavisetti Narayan
ab11e83535 [WEB-2843] chore: updated the cycle end date logic (#6194)
* chore: updated the cycle end date logic

* chore: changed the key for timezone
2024-12-13 13:34:07 +05:30
Akshita Goyal
b4112358ac [WEB-2688] chore: added icons and splitted issue header (#6195)
* chore: added icons and splitted issue header

* fix: added ee filler component

* fix: component name fixed

* fix: removed dupes

* fix: casing
2024-12-13 13:31:13 +05:30
Aaryan Khandelwal
77239ebcd4 fix: GitHub casing across the platform (#6193) 2024-12-13 02:22:46 +05:30
Prateek Shourya
54f828cbfa refactor: enhance components modularity and introduce new UI components (#6192)
* feat: add navigation dropdown component

* chore: enhance title/description loader and component modularity

* chore: issue store filter update

* chore: added few icons to ui package

* chore: improvements for tabs component

* chore: enhance sidebar modularity

* chore: update issue and router store to add support for additional issue layouts

* chore: enhanced cycle components modularity

* feat: added project grouping header for cycles list

* chore: enhanced project dropdown component by adding multiple selection functionality

* chore: enhanced rich text editor modularity by taking members ids as props for mentions

* chore: added functionality to filter disabled layouts in issue-layout dropdown

* chore: added support to pass project ids as props in project card list

* feat: multi select project modal

* chore: separate out project component for reusability

* chore: command palette store improvements

* fix: build errors
2024-12-12 21:40:57 +05:30
Bavisetti Narayan
9ad8b43408 chore: handled the cycle date time using project timezone (#6187)
* chore: handled the cycle date time using project timezone

* chore: reverted the frontend commit
2024-12-12 14:11:12 +05:30
Prateek Shourya
38e8a5c807 fix: command palette build (#6186) 2024-12-11 18:19:09 +05:30
Prateek Shourya
a9bd2e243a refactor: enhance command palette modularity (#6139)
* refactor: enhance command palette modularity

* chore: minor updates to command palette store
2024-12-11 18:02:58 +05:30
Vamsi Krishna
ca0d50b229 fix: no activity while moving inbox issues (#6185) 2024-12-11 17:57:27 +05:30
Vamsi Krishna
7fca7fd86c [WEB-2774] fix: favorites reorder (#6179)
* fix: favorites reorder

* chore: added error handling

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

---------

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
2024-12-11 16:29:39 +05:30
Prateek Shourya
0ac68f2731 improvement: refactored issue grouping logic to access MobX store directly (#6134)
* improvement: refactored issue grouping logic to access MobX store directly

* chore: minor updates
2024-12-11 15:14:15 +05:30
rahulramesha
5a9ae66680 chore: Remove shouldIgnoreDependencies flags while dragging in timeline view (#6150)
* remove shouldEnable dependency flags for timeline view

* chore: error handling

---------

Co-authored-by: Prateek Shourya <prateekshourya29@gmail.com>
2024-12-11 13:43:48 +05:30
Vamsi Krishna
134644fdf1 [WEB-2382] chore: notification files restructuring (#6181)
* chore: adjusted increment/decrement for unread count

* chore: improved param handling for unread notification count function

* chore: file restructuring

* fix: notification types

* chore: file restructuring

* chore: modified notification types

* chore: modified types for notification

* chore: removed redundant checks for id
2024-12-11 13:41:19 +05:30
sriram veeraghanta
d0f3987aeb fix: instance changelog url updated 2024-12-10 21:03:44 +05:30
sriram veeraghanta
f06b1b8c4a fix: updated package version 2024-12-10 21:02:29 +05:30
sriram veeraghanta
6e56ea4c60 fix: updated changelog url in apiserver 2024-12-10 20:28:51 +05:30
Anmol Singh Bhatia
216a69f991 chore: workspace draft and inbox issue local db mutation (#6180) 2024-12-10 19:12:24 +05:30
Vihar Kurama
205395e079 fix: changed checkboxes to toggles on notifications settings page (#6175) 2024-12-10 01:02:34 +05:30
Bavisetti Narayan
ff8bbed6f9 chore: changed the soft deletion logic (#6171) 2024-12-09 20:29:30 +05:30
Vamsi Krishna
d04619477b [WEB-2382] chore: notifications code improvement (#6172)
* chore: adjusted increment/decrement for unread count

* chore: improved param handling for unread notification count function
2024-12-09 18:06:56 +05:30
sriram veeraghanta
547c138084 fix: ui package module resolution 2024-12-09 15:56:20 +05:30
Anmol Singh Bhatia
5c907db0e2 [WEB-2818] chore: project navigation items code refactor (#6170)
* chore: project navigation items code refactor

* fix: build error

* chore: code refactor

* chore: code refactor
2024-12-09 14:37:04 +05:30
Aaryan Khandelwal
a85e592ada fix: creating a new sub-issue from workspace level (#6169) 2024-12-09 12:15:10 +05:30
sriram veeraghanta
b21d190ce0 fix: added github pull request template 2024-12-09 02:55:09 +05:30
sriram veeraghanta
cba41e0755 fix: upgrading the express version 2024-12-09 02:35:48 +05:30
sriram veeraghanta
02308eeb15 fix: django version upgrade 2024-12-09 02:28:06 +05:30
guru_sainath
9ee41ece98 fix: email check validation to handle case in-sensitive email (#6168) 2024-12-07 17:55:50 +05:30
Vamsi Krishna
666ddf73b6 [WEB-2382] chore: notification snooze modal (#6164)
* modified notification store

* notification snooze types fix

* handled promise

* modified notifications layout

* increased pagination count for notifications
2024-12-06 16:27:45 +05:30
Satish Gandham
4499a5fa25 Sync issues and workspace data when the issue properties like labels/modules/cycles etc are deleted from the project (#6165) 2024-12-06 16:27:07 +05:30
sriram veeraghanta
727dd4002e fix: updated lint command in packages 2024-12-06 15:00:11 +05:30
sriram veeraghanta
4b5a2bc4e5 chore: lint related changes and packaging fixes (#6163)
* fix: lint related changes and packaging fixes

* adding color validations
2024-12-06 14:56:49 +05:30
sriram veeraghanta
b1c340b199 fix: build branch workflow upload artifacts 2024-12-05 16:51:20 +05:30
rahulramesha
a612a17d28 chore remove unnecessary CTA (#6161) 2024-12-05 16:37:55 +05:30
Prateek Shourya
d55ee6d5b8 fix: remove unwanted states fetching logic to avoid multiple API calls. (#6158) 2024-12-05 15:26:35 +05:30
Prateek Shourya
aa1e192a50 improvement: update fetch map during workspace-level module fetch to reduce redundant API calls (#6159) 2024-12-05 15:26:15 +05:30
guru_sainath
6cd8af1092 chore: updated powered by (#6160) 2024-12-05 15:12:37 +05:30
rahulramesha
66652a5d71 refactor project states to make way for new features (#6156) 2024-12-05 12:46:51 +05:30
sriram veeraghanta
3bccda0c86 chore: formatting and typo fixes 2024-12-04 19:40:37 +05:30
sriram veeraghanta
fb3295f5f4 fix: sites opengraph title and description added 2024-12-04 17:58:23 +05:30
sriram veeraghanta
fa3aa362a9 fix: lint errors 2024-12-04 17:22:41 +05:30
Bavisetti Narayan
b73ea37798 chore: improve the cascading logic (#6152) 2024-12-04 16:15:57 +05:30
Vamsi Krishna
d537e560e3 [WEB-2802] fix: dropdown visibility issue in Safari (#6151)
* filters dropdown fix for Safari

* added comments for translation

* fixed drop down visibility issue
2024-12-04 15:27:34 +05:30
guru_sainath
1b92a18ef8 chore: updated the ssr rendering on sites (#6145)
* fix: refactoring

* fix: site ssr implementation

* chore: fixed auto reload on file change in sites

* chore: updated constant imports and globalised powerBy component

* chore: resolved lint and updated the env

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2024-12-04 14:24:53 +05:30
rahulramesha
31b6d52417 fix root issue store to have updated url params at all times (#6147) 2024-12-04 13:57:33 +05:30
Vamsi Krishna
a153de34d6 fixed priority icons shape (#6144) 2024-12-04 13:57:14 +05:30
Aaryan Khandelwal
64a44f4fce style: add custom class to editor paragraph and heading blocks (#6143) 2024-12-04 13:43:52 +05:30
guru_sainath
bb8a156bdd fix: removed changelog endpoint (#6146) 2024-12-04 13:42:15 +05:30
Akshita Goyal
f02a2b04a5 fix: export btn overlap issue (#6149) 2024-12-04 13:41:48 +05:30
Bavisetti Narayan
b6ab853c57 chore: filter out the removed cycle from issue detail (#6138) 2024-12-03 16:48:14 +05:30
Aaryan Khandelwal
fe43300aa7 fix: pages empty state authorization (#6141) 2024-12-03 14:53:02 +05:30
Prateek Shourya
849d9891d2 chore: community edition product updates link (#6132)
* chore: community edition product updates link

* fix: iframe embed for changelog

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2024-12-03 13:28:28 +05:30
Vamsi Krishna
2768f560ad [WEB-2802] fix: filters dropdown fix for Safari (#6133)
* filters dropdown fix for Safari

* added comments for translation
2024-12-03 12:51:39 +05:30
Anmol Singh Bhatia
fe5999ceff fix: intake issue permission (#6136) 2024-12-02 19:49:09 +05:30
rahulramesha
da0071256f fix half block dragging (#6135) 2024-12-02 19:30:58 +05:30
M. Palanikannan
3c6006d04a [PE-31] feat: Add lock unlock archive restore realtime sync (#5629)
* fix: add lock unlock archive restore realtime sync

* fix: show only after editor loads

* fix: added strong types

* fix: live events fixed

* fix: remove unused vars and logs

* fix: converted objects to enum

* fix: error handling and removing the events in read only mode

* fix: added check to only update if the image aspect ratio is not present already

* fix: imports

* fix: props order

* revert: no need of these changes anymore

* fix: updated type names

* fix: order of things

* fix: fixed types and renamed variables

* fix: better typing for the real time updates

* fix: trying multiplexing our socket connection

* fix: multiplexing socket connection in read only editor as well

* fix: remove single socket logic

* fix: fixing the cleanup deps for the provider and localprovider

* fix: add a better data structure for managing events

* chore: refactored realtime events into hooks

* feat: fetch page meta while focusing tabs

* fix: cycling through items on slash command item in down arrow

* fix: better naming convention for realtime events

* fix: simplified localprovider initialization and cleaning

* fix: types from ui

* fix: abstracted away from exposing the provider directly

* fix: coderabbit suggestions

* regression: pass user in dependency array

* fix: removed page action api calls by the other users the document is synced with

* chore: removed unused imports
2024-12-02 14:26:36 +05:30
Aaryan Khandelwal
8c04aa6f51 dev: revamp pages authorization (#6094) 2024-12-02 13:59:01 +05:30
Aaryan Khandelwal
9f14167ef5 refactor: editor code splitting (#6102)
* fix: merge conflicts resolved from preview

* fix: space app build errors

* fix: product updates modal

* fix: build errors

* fix: lite text read only editor

* refactor: additional options push logic
2024-12-02 13:51:27 +05:30
Aaryan Khandelwal
11bfbe560a fix: checked colored todo list item (#6113) 2024-12-02 13:47:50 +05:30
Aaryan Khandelwal
fc52936024 fix: escape markdown content for images (#6096) 2024-12-02 13:36:12 +05:30
Vamsi Krishna
5150c661ab reduced the components moved (#6110) 2024-12-02 13:35:40 +05:30
Vamsi Krishna
63bc01f385 [WEB-2774] fix: reordering favorites and favorite folders (#6119)
* fixed re order for favorites

* fixed lint errors

* added reorder

* fixed reorder inside folder

* fixed lint issues

* memoized reorder

* removed unnecessary comments

* separated duplicate logic to a common file

* removed code comments

* fixed favorite remove while reorder inside folder

* fixed folder remove while reorder inside folder

* fixed-reorder issue

* added last child to drop handled

* fixed orderby function

* removed unnecessary comments
2024-12-02 13:35:09 +05:30
Anmol Singh Bhatia
1953d6fe3a [WEB-2762] chore: loader code refactor (#5992)
* chore: loader code refactor

* chore: code refactor

* chore: code refactor

* chore: code refactor
2024-12-02 13:24:01 +05:30
Anmol Singh Bhatia
1b9033993d [WEB-2799] chore: global component and code refactor (#6131)
* chore: local storage helper hook added to package

* chore: tabs global component added

* chore: collapsible button improvement

* chore: linear progress indicator improvement

* chore: fill icon set added to package
2024-12-02 13:22:08 +05:30
sriram veeraghanta
75ada1bfac fix: constants package updates 2024-12-01 21:26:35 +05:30
Prateek Shourya
d0f9a4d245 chore: add redirection to plane logo in invitations page (#6125) 2024-11-29 20:20:49 +05:30
sriram veeraghanta
05894c5b9c Merge pull request #6121 from makeplane/preview
release: v0.24.0
2024-11-29 19:36:12 +05:30
Prateek Shourya
5926c9e8e9 fix: comment images in profile activity page (#6123) 2024-11-29 19:20:31 +05:30
Prateek Shourya
5aeedd1e5a [WEB-2610] fix: workspace redirection from admin app (#6122) 2024-11-29 19:02:13 +05:30
sriram veeraghanta
7725b200f7 fix: changelog redirection 2024-11-29 18:13:29 +05:30
sriram veeraghanta
2c69538617 fix: hypermode text typo changes 2024-11-29 17:47:46 +05:30
pablohashescobar
41bd98dd63 fix: instance collect 2024-11-29 17:41:06 +05:30
sriram veeraghanta
bf1c326b44 Merge branch 'preview' of github.com:makeplane/plane into preview 2024-11-29 17:36:00 +05:30
sriram veeraghanta
3d1485461d fix: lockfile udpated 2024-11-29 17:35:47 +05:30
rahulramesha
4251b114c3 chore: enable no load by default (#5968)
* enable no load by default

* remove help section brackets

* fallback to server with mentions
2024-11-29 14:55:39 +05:30
Prateek Shourya
712339a638 minor improvements for workspace management (#6099)
* minor improvements for workspace management

* typo fix
2024-11-29 14:53:30 +05:30
sriram veeraghanta
1c9162e1f1 chore: turbo version upgrade 2024-11-29 14:40:14 +05:30
sriram veeraghanta
f1e6f59716 chore: package version updated 2024-11-29 14:37:53 +05:30
sriram veeraghanta
69f235ed24 fix: merge conflicts 2024-11-29 14:35:43 +05:30
Vamsi Krishna
4aa01ffebe [WEB-2795] chore: removed header links for project breadcrumb inside project detail and list (#6116)
* removed header links for project breadcrumb inside project detail

* Add total issue count while syncing project to telemetry

---------

Co-authored-by: Satish Gandham <satish.iitg@gmail.com>
2024-11-29 11:39:44 +05:30
Bavisetti Narayan
41c0ba502c fix: intake toggle (#6111) 2024-11-28 16:58:21 +05:30
Bavisetti Narayan
378e896bf0 fix: notification count (#6109) 2024-11-28 12:58:09 +05:30
Prateek Shourya
e3799c8a40 fix: add back issue identifier for relation activity. (#6106) 2024-11-28 12:50:56 +05:30
sriram veeraghanta
0d70397639 chore: issue version migrations updates 2024-11-28 12:42:30 +05:30
sriram veeraghanta
d2758fe5e6 Revert "fix: refactor editor extensions code splitting"
This reverts commit 234513278f.
2024-11-27 18:20:41 +05:30
Bavisetti Narayan
1420b7e7d3 chore: restrict email notifications for removed users (#6100) 2024-11-27 15:06:55 +05:30
Prateek Shourya
05d3e3ae45 feat: workspace management from admin app (#6093)
* feat: workspace management from admin app

* chore: UI and UX copy improvements

* chore: ux copy improvements
2024-11-26 23:57:41 +05:30
Prateek Shourya
9dbb2b26c3 fix: issue activity sort order component import (#6098) 2024-11-26 20:49:39 +05:30
Vamsi Krishna
fa2e60101f [WEB-2774] Chore: re-ordering functionality for entities in favorites. (#6078)
* fixed re order for favorites

* fixed lint errors

* added reorder

* fixed reorder inside folder

* fixed lint issues

* memoized reorder

* removed unnecessary comments

* separated duplicate logic to a common file

* removed code comments
2024-11-26 19:15:21 +05:30
Satish Gandham
6376a09318 - Change batch size to 50 for inserting issues (#6085)
- Fallback to server when mentions filter is used
- Split load workspace into multiple transactions
2024-11-26 19:12:39 +05:30
Vamsi Krishna
32048be26f [WEB-2432]fix: project not found state and error page alignment (#6095)
* fixed error page alignment and projects empty page

* spelling corrected

* spelling corrected
2024-11-26 19:11:35 +05:30
Vamsi Krishna
f09e37fed8 [WEB-2779] feat: Added sort order for issue activity (#6087)
* added sort order for issue activity

* fixed invalid date generation issue

* fixed lint errors, optimized code
2024-11-26 18:58:01 +05:30
sriram veeraghanta
31c761db25 fix: nivo charts update fixes (#6080) 2024-11-26 18:52:42 +05:30
Aaryan Khandelwal
f7b2cee418 fix: misalignment of swimlanes group header (#6077) 2024-11-26 18:51:46 +05:30
Vamsi Krishna
1d9b02b085 [WEB-2724] fix: custom properties issue while moving to project (#6090)
* fixed custom properties adding issue

* added error handling to function
2024-11-26 18:50:28 +05:30
sriram veeraghanta
84c5e70181 chore: upgrade turbo repo version 2024-11-26 18:14:28 +05:30
sriram veeraghanta
234513278f fix: refactor editor extensions code splitting 2024-11-26 18:08:32 +05:30
Nikhil
76fe136d85 fix: project join for admin and members (#6097)
* chore: add enum role comparison

* chore: add member also to join a project
2024-11-26 16:58:41 +05:30
sriram veeraghanta
c4a5c5973f fix: tracer error handling 2024-11-26 15:30:53 +05:30
sriram veeraghanta
89819a9473 fix: workflow fixes 2024-11-26 15:13:58 +05:30
sriram veeraghanta
182aa58f6c fix: tracer init fixes 2024-11-26 15:11:54 +05:30
Anmol Singh Bhatia
7469e67b71 fix: project view application error (#6091) 2024-11-25 20:05:03 +05:30
sriram veeraghanta
1cb16bf176 fix: email error handling on magic auth 2024-11-25 15:02:50 +05:30
Bavisetti Narayan
ca88675dbf chore: added dates in issue export (#6088)
* chore: added dates in issue export

* chore: added date converter
2024-11-22 19:59:08 +05:30
Nikhil
86f8743ade chore: remove exists checks (#6086) 2024-11-22 17:00:20 +05:30
Nikhil
1a6ec7034a chore: management command to add user to a project (#6084) 2024-11-22 16:05:58 +05:30
Bavisetti Narayan
42d6078f60 [WEB-2776] fix: restrict notifications (#6081)
* chore: restrict notifications

* chore: handled the issue filter duplicates

---------

Co-authored-by: gurusainath <gurusainath007@gmail.com>
2024-11-22 16:02:11 +05:30
Bavisetti Narayan
6ef62820fa [WEB-2778] chore: private project join restriction (#6082)
* chore: private project join restriction

* chore: update project not found container layout

* chore: restrict other users to join private project

* chore: add check condition using enum

---------

Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
2024-11-22 16:00:19 +05:30
sriram veeraghanta
b72d18079f fix: adding start and target date in issue exporter 2024-11-21 19:24:39 +05:30
sriram veeraghanta
a42c69f619 chore: pyproject toml changes 2024-11-21 18:00:02 +05:30
sriram veeraghanta
0dbd4cfe97 chore: formatting changes 2024-11-21 17:42:44 +05:30
Anmol Singh Bhatia
a446bc043e [WEB-2765] fix: issue detail page unnecessary scroll (#6068)
* fix: issue detail page unnecessary scroll

* fix: issue detail sidebar ui
2024-11-21 15:16:47 +05:30
sriram veeraghanta
daed58be0f fix: adding new restricted workspace slugs 2024-11-20 20:36:53 +05:30
pablohashescobar
ca91d5909b chore: formatting errors 2024-11-20 13:00:13 +05:30
pablohashescobar
3bea2e8d1b chore: fix instance apis 2024-11-20 12:35:13 +05:30
sriram veeraghanta
1325064676 fix: typo and naming conventions 2024-11-20 00:32:30 +05:30
sriram veeraghanta
a01a371767 fix: typo fixes 2024-11-20 00:00:04 +05:30
sriram veeraghanta
2d60337eac fix: celery timestamp changes 2024-11-19 20:17:53 +05:30
pablohashescobar
f3ac26e5c9 chore: instances 2024-11-19 19:47:13 +05:30
Aaryan Khandelwal
d5a55de17a fix: cover image update fix for project and user profile (#6075)
* fix: cover image update payload

* fix: cover image assets

* chore: add gif support

---------

Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
2024-11-19 18:28:53 +05:30
Prateek Shourya
6f497b024b [WEB-2770] fix: inbox issue detail loader on focus change (#6074) 2024-11-19 17:07:32 +05:30
Nikhil
a3e8ee6045 fix: remove caching for user based apis to handle avatar uploads (#6072) 2024-11-19 15:42:10 +05:30
sriram veeraghanta
c1ac6e4244 chore: removing dependabot updates alerts 2024-11-18 12:06:13 +05:30
dependabot[bot]
6d98619082 chore(deps): bump actions/checkout from 3 to 4 (#6005)
Bumps [actions/checkout](https://github.com/actions/checkout) from 3 to 4.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/v3...v4)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-16 19:42:08 +05:30
dependabot[bot]
52d3169542 chore(deps): bump softprops/action-gh-release from 2.0.8 to 2.1.0 (#6010)
Bumps [softprops/action-gh-release](https://github.com/softprops/action-gh-release) from 2.0.8 to 2.1.0.
- [Release notes](https://github.com/softprops/action-gh-release/releases)
- [Changelog](https://github.com/softprops/action-gh-release/blob/master/CHANGELOG.md)
- [Commits](https://github.com/softprops/action-gh-release/compare/v2.0.8...v2.1.0)

---
updated-dependencies:
- dependency-name: softprops/action-gh-release
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-16 19:40:41 +05:30
dependabot[bot]
5989b1a134 chore(deps): bump github/codeql-action from 2 to 3 (#6011)
Bumps [github/codeql-action](https://github.com/github/codeql-action) from 2 to 3.
- [Release notes](https://github.com/github/codeql-action/releases)
- [Changelog](https://github.com/github/codeql-action/blob/main/CHANGELOG.md)
- [Commits](https://github.com/github/codeql-action/compare/v2...v3)

---
updated-dependencies:
- dependency-name: github/codeql-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-16 19:39:31 +05:30
sriram veeraghanta
291bb5c899 Merge branch 'preview' of github.com:makeplane/plane into preview 2024-11-16 19:37:22 +05:30
sriram veeraghanta
2ef00efaab fix: turbo repo upgrade 2024-11-16 19:37:06 +05:30
dependabot[bot]
c5f96466e9 chore(deps): bump cross-spawn in the npm_and_yarn group (#6038)
Bumps the npm_and_yarn group with 1 update: [cross-spawn](https://github.com/moxystudio/node-cross-spawn).


Updates `cross-spawn` from 7.0.3 to 7.0.5
- [Changelog](https://github.com/moxystudio/node-cross-spawn/blob/master/CHANGELOG.md)
- [Commits](https://github.com/moxystudio/node-cross-spawn/compare/v7.0.3...v7.0.5)

---
updated-dependencies:
- dependency-name: cross-spawn
  dependency-type: indirect
  dependency-group: npm_and_yarn
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-16 18:40:01 +05:30
sriram veeraghanta
35938b57af fix: dependabot security patch only 2024-11-16 18:36:47 +05:30
dependabot[bot]
1b1b160c04 chore(deps): bump docker/build-push-action from 5.1.0 to 6.9.0 (#6004)
Bumps [docker/build-push-action](https://github.com/docker/build-push-action) from 5.1.0 to 6.9.0.
- [Release notes](https://github.com/docker/build-push-action/releases)
- [Commits](https://github.com/docker/build-push-action/compare/v5.1.0...v6.9.0)

---
updated-dependencies:
- dependency-name: docker/build-push-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-11-16 18:30:44 +05:30
sriram veeraghanta
4149e84e62 Create dependabot.yml (#6002) 2024-11-16 18:25:29 +05:30
Aaryan Khandelwal
9408e92e44 Revert "[WEB-1435] dev: conflict free issue descriptions (#5912)" (#6000)
This reverts commit e9680cab74.
2024-11-15 17:13:31 +05:30
Aaryan Khandelwal
e9680cab74 [WEB-1435] dev: conflict free issue descriptions (#5912)
* chore: new description binary endpoints

* chore: conflict free issue description

* chore: fix submitting status

* chore: update yjs utils

* chore: handle component re-mounting

* chore: update buffer response type

* chore: add try catch for issue description update

* chore: update buffer response type

* chore: description binary in retrieve

* chore: update issue description hook

* chore: decode description binary

* chore: migrations fixes and cleanup

* chore: migration fixes

* fix: inbox issue description

* chore: move update operations to the issue store

* fix: merge conflicts

* chore: reverted the commit

* chore: removed the unwanted imports

* chore: remove unnecessary props

* chore: remove unused services

* chore: update live server error handling

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2024-11-15 16:38:58 +05:30
sriram veeraghanta
229610513a fix: django instrumentation fixes 2024-11-13 21:04:16 +05:30
sriram veeraghanta
f9d9c92c83 fix: opentelemetry sdk package update 2024-11-13 20:27:47 +05:30
Aaryan Khandelwal
89588d4451 fix: issue and module link validation (#5994)
* fix: issue and module link validation

* chore: removed reset logic
2024-11-13 19:47:30 +05:30
Akshita Goyal
3eb911837c fix: display property in intake (#5993) 2024-11-13 18:02:24 +05:30
rahulramesha
4b50b27a74 [WEB-2442] feat: Minor Timeline view Enhancements (#5987)
* fix timeline scroll to the right in some cases

(cherry picked from commit 17043a6c7f)

* add get position based on Date

(cherry picked from commit 2fbe22d689)

* Add sticky block name to enable it to be read throughout the block regardless of scroll position

(cherry picked from commit 447af2e05a)

* Enable blocks to have a single date on the block charts

(cherry picked from commit cb055d566b)

* revert back date-range changes

* change gradient of half blocks on Timeline

* Add instance Id for Timeline Sidebar dragging to avoid enabling dropping of other drag instances

* fix timeline scrolling height
2024-11-13 15:40:37 +05:30
rahulramesha
f44db89f41 [WEB-2628] fix: Sorting by estimates (#5988)
* fix estimates sorting in Front end side

* change estimate sorting keys

* - Fix estimate sorting when local db is enabled
- Fix a bug with sorting on special fields on spreadsheet layout
- Cleanup logging

* Add logic for order by based on layout for special cases of no load

---------

Co-authored-by: Satish Gandham <satish.iitg@gmail.com>
2024-11-13 15:38:43 +05:30
Akshita Goyal
8c3189e1be fix: intake status count (#5990) 2024-11-13 15:38:03 +05:30
sriram veeraghanta
eee2145734 fix: code splitting and instance maintenance screens 2024-11-12 19:48:31 +05:30
Aaryan Khandelwal
106710f3d0 fix: custom background color for table header (#5989) 2024-11-12 15:26:57 +05:30
Anmol Singh Bhatia
db8c4f92e8 chore: theme and code refactor (#5983)
* chore: added pi colors

* chore: de-dupe modal height

---------

Co-authored-by: gakshita <akshitagoyal1516@gmail.com>
2024-11-11 19:53:43 +05:30
Anmol Singh Bhatia
a6cc2c93f8 chore: worklog enhancements (#5982) 2024-11-11 19:27:07 +05:30
Bavisetti Narayan
0428ea06f6 chore: filter the deleted issue assignee (#5984) 2024-11-11 19:25:38 +05:30
Aaryan Khandelwal
7082f7014d style: remove unnecessary bottom padding from the rich text editor (#5976) 2024-11-11 16:11:34 +05:30
Anmol Singh Bhatia
c7c729d81b [WEB-2283] fix: create issue modal parent select ui (#5980)
* fix: create issue modal parent select ui

* chore: code refactor
2024-11-11 16:11:10 +05:30
Aaryan Khandelwal
97eb8d43d4 style: updated margins and font styles for editor (#5978)
* style: updated margins and font styles for editor

* fix: code block font size in small font

* fix: remove duplicate code
2024-11-11 16:10:47 +05:30
Anmol Singh Bhatia
1217af1d5f chore: restrict sub-issue to have different project id than parent (#5981) 2024-11-11 16:10:27 +05:30
Bavisetti Narayan
13083a77eb chore: enable intake from project settings (#5977) 2024-11-09 17:01:21 +05:30
Akshita Goyal
0cd36b854e fix: intake loading (#5966)
* fix: intake loading

* fix: image upload in space
2024-11-08 17:17:15 +05:30
Bavisetti Narayan
1d314dd25f fix: renamed inbox to intake (#5967)
* feat: intake

* chore: intake model migration changes

* dev: update dummy data

* dev: add duplicate apis for inbox

* dev: fix external apis

* fix: external apis

* chore: migration file changes

---------

Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
2024-11-08 17:10:24 +05:30
rahulramesha
1743717351 fix related to activity (#5972) 2024-11-08 17:09:49 +05:30
Satish Gandham
acba451803 [WEB-2706] fix: Add fallback when db initialisation fails (#5973)
* Add fallback when db initialization fails

* add checks for instance.exec

* chore: convert issue boolean fields to actual boolean value.

* change instance exec code

* sync issue to local db when inbox issue is accepted and draft issue is moved to project

* chore: added project and workspace keys

---------

Co-authored-by: rahulramesha <rahulramesham@gmail.com>
Co-authored-by: Prateek Shourya <prateekshourya29@gmail.com>
Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-11-08 17:09:26 +05:30
Aaryan Khandelwal
2193e8c79c fix: editor user config (#5974) 2024-11-08 13:30:06 +05:30
Anmol Singh Bhatia
4c6ab984c3 [WEB-2742] chore: issue link ui revamp (#5971)
* chore-issue-link-ui

* chore: issue link ui revamp
2024-11-07 19:24:15 +05:30
Prateek Shourya
7574206a41 [WEB-2554] improvement: dashboard sidebar list items. (#5970) 2024-11-07 15:31:28 +05:30
Anmol Singh Bhatia
eebc327b10 chore: app sidebar behaviour (#5964) 2024-11-06 18:36:23 +05:30
Prateek Shourya
e19cb012be [WEB-2728] improvement: add true-transparent variant for textarea. (#5960) 2024-11-06 16:56:15 +05:30
guru_sainath
9d1253a61d chore: infra update for maintenance mode (#5963) 2024-11-06 15:13:51 +05:30
Bavisetti Narayan
56755b0e9c chore: intake migration (#5950)
* chore: intake migration

* chore: removed the enum

* chore: removed the source type enum

* chore: changed the migration file
2024-11-05 19:21:20 +05:30
rahulramesha
438d1bcfbd add missing config to get issues api call (#5955) 2024-11-05 17:50:23 +05:30
Akshita Goyal
45a5cf5119 fix: editor height (#5953)
* fix: editor height

* fix: removed unwanted class

* fix: editor height
2024-11-05 17:47:39 +05:30
Aaryan Khandelwal
b4de055463 [PULSE-42] feat: text alignment for all editors (#5847)
* feat: text alignment for editors

* fix: text alignment types

* fix: build errors

* fix: build error

* fix: toolbar movement post alignment selection

* fix: callout type

* fix: image node types

* chore: add ts error warning
2024-11-05 17:46:34 +05:30
Aaryan Khandelwal
bb311b750f fix: wrong token being passed in the read-only editor (#5954)
* fix: wrong token

* chore: update useMemo dependencies
2024-11-05 17:45:53 +05:30
Anmol Singh Bhatia
ea8583b2d4 chore: code refactor (#5952)
* chore: code refactor

* chore: code refactor
2024-11-05 17:04:03 +05:30
Akshita Goyal
eed2ca77ef fix: added workspaceslug in renderChildren of project settings (#5951)
* fix: added workspaceslug in renderChildren of project settings

* fix: updated apis

* fix: types

* fix: added editor

* fix: handled avatar for intake
2024-11-05 16:07:27 +05:30
Akshita Goyal
9309d1b574 feat: Pi chat (#5933)
* fix: added pi chat

* fix: added bot

* fix: removed pi chat from community version

* fix: removed unwanted files

* fix: removed unused import
2024-11-05 15:16:58 +05:30
Aaryan Khandelwal
f205d72782 fix: floating toolbar max width (#5949) 2024-11-04 20:17:20 +05:30
rahulramesha
3d2fe7841f fix issues fetching while changing filters by making sure to pass the abort controller config to apis (#5948) 2024-11-04 20:16:56 +05:30
rahulramesha
71589f93ca [WEB-2442] fix: Timeline layout bugs (#5946)
* fix relation creation and removal for Issue relations

* fix Scrolling to block when the block is beyond current chart's limits

* fix dark mode for timeline layout

* use a hook to get the current relations available in the environment, instead of directly importing it

* Update relation activity for all the relations
2024-11-04 16:55:38 +05:30
Satish Gandham
a1bfde6af9 [WEB-2706] fix: Fix issue with SQLite transactions (#5934)
* - Fix transaction within transaction issue
- Close DB handles on reload
- Fix GET_ISSUES tracking

* Cleanup stray code

* Fix lint error

* Possible fix for NoModificationAllowedError
2024-11-04 16:54:13 +05:30
Lakhan Baheti
20b2a70939 fix: global css conflict (#5945) 2024-11-04 16:15:17 +05:30
Prateek Shourya
914811b643 fix: build error for product updates modal. (#5944) 2024-11-04 14:04:59 +05:30
Nikhil
0dead39fd1 chore: device migration (#5939)
* chore: device migration

* chore: devices

* chore: update device migrations

* chore: update migration

* chore: update migrations

* chore: update device migrations
2024-11-01 22:40:39 +05:30
sriram veeraghanta
27d7d91185 fix: new set of migrations in db models 2024-11-01 21:24:57 +05:30
Lakhan Baheti
3696062372 [WEB-2730] chore: core/editor updates to support mobile editor (#5910)
* added editor changes w.r.t mobile-editor

* added external extensions option

* fix: type errors in image block

* added on transaction method

* fix: optional prop fixed

* fix: memoize the extensions array

* fix: added missing deps

* fix: image component types

* fix: remove range prop

* fix: type fixes and better names of img src

* fix: image load blinking

* fix: code review

* fix: props code review

* fix: coderabbit review

---------

Co-authored-by: Palanikannan M <akashmalinimurugu@gmail.com>
2024-10-30 17:39:02 +05:30
Lakhan Baheti
8ea34b5995 [WEB-2729] chore: updated live server auth cookies handling (#5913)
* chore: updated live server auth cookies handling

* chore: update token parsing logic

* fix: types and better logical separation between the existing two tokens

* fix: better fallback to use request headers for cookies

---------

Co-authored-by: Palanikannan M <akashmalinimurugu@gmail.com>
2024-10-30 17:38:29 +05:30
Bavisetti Narayan
403482fa6e fix: workspace user property migration (#5908)
* fix: workspace user property migration

* fix: issue relations migration
2024-10-30 13:52:14 +05:30
Nikhil
fe18eae8cd fix: integrity error on account creation (#5876)
* fix: integrity error on account creation

* fix: exception handling
2024-10-30 13:46:05 +05:30
rahulramesha
3f429a1dab minor build fix (#5929) 2024-10-29 20:51:56 +05:30
Ketan Sharma
22b616b03c [WEB-2449] fix: admin is not able to edit issues in notifications peek overview (#5877)
* fix backend

* fix missing arguments for allow permissions

* Revert "fix backend"

This reverts commit 208636d7c8.
2024-10-29 19:46:20 +05:30
Anmol Singh Bhatia
57eb08c8a2 chore: code refactoring (#5928)
* chore: de dupe code splitting

* chore: code refactor
2024-10-29 19:39:55 +05:30
Prateek Shourya
4bc751b7ab [WEB-2500] feat: Product updates modal (What's new in Plane) (#5690)
* [WEB-2500] feat: Product updates modal (What's new in Plane)

* fix: build errors.

* fix: lint errors resolved.

* chore: minor improvements.

* chore: minor fixes
2024-10-29 19:26:00 +05:30
Aaryan Khandelwal
c423d7d9df [WEB-2717] chore: implemented issue attachment upload progress (#5901)
* chore: added attachment upload progress

* chore: add debounce while updating the upload status

* chore: update percentage calc logic

* chore: update debounce interval
2024-10-29 19:22:29 +05:30
rahulramesha
538e78f135 refactor timeline store for code splitting (#5926) 2024-10-29 17:57:45 +05:30
Aaryan Khandelwal
b4bbe3a8ba fix: change html tag name for callout (#5924) 2024-10-29 14:12:12 +05:30
Prateek Shourya
b67f352b90 fix: lint and build errors (#5923)
* fix: lint errors.

* fix: build errors
2024-10-29 13:45:18 +05:30
Anmol Singh Bhatia
8829575780 chore: app sidebar add issue button improvement (#5921) 2024-10-29 13:42:42 +05:30
rahulramesha
724adeff5c [WEB-2442] fix: Timeline Improvements and bug fixes (#5922)
* improve auto scroller logic

* fix drag indicator visibility on for blocks

* modify timeline store logic and improve timeline scrolling logic

* fix width of block while dragging with left handle

* fix block arrow direction while block is out of viewport
2024-10-29 13:42:14 +05:30
rahulramesha
a88a39fb1e [WEB-2442] feat: Revamp Timeline Layout (#5915)
* chore: added issue relations in issue listing

* chore: added pagination for issue detail endpoint

* chore: bulk date update endpoint

* chore: appended the target date

* chore: issue relation new types defined

* fix: order by and issue filters

* fix: passed order by in pagination

* chore: changed the key for issue dates

* Revamp Timeline Layout

* fix block dragging

* minor ui fixes

* improve auto scroll UX

* remove unused import

* fix timeline layout heights

* modify base timeline store

* Segregate issue relation types

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-10-28 18:03:31 +05:30
Aaryan Khandelwal
f986bd83fd fix: callout content not being saved in description html (#5920) 2024-10-28 17:00:32 +05:30
Satish Gandham
6113aefde0 Fix issue with SQLite transactions (#5919) 2024-10-28 14:14:57 +05:30
Bavisetti Narayan
6d08cf2757 fix: rendered the analytics for labels (#5906)
* fix: rendered the analytics for labels

* fix: analytics exports
2024-10-24 20:35:27 +05:30
Bavisetti Narayan
2caf23fb71 fix: background task metadata (#5909) 2024-10-24 20:35:05 +05:30
Bavisetti Narayan
b33328dec5 fix: issue retrieval endpoint (#5907) 2024-10-24 20:33:16 +05:30
Aaryan Khandelwal
14b31e3fcd [PULSE-36] feat: callout component for pages and issue descriptions (#5856)
* feat: editor callouts

* chore: backspace action updated

* chore: update callout attributes types

* chore: revert emoji picker changes

* chore: removed class attribute

* chore: added sanitization for local storage values

* chore: disable emoji picker search
2024-10-24 15:36:38 +05:30
Satish Gandham
9fb353ef54 [WEB-2706] chore: Switch to wa-sqlite (#5859)
* fix layout switching when filter is not yet completely fetched

* add layout in issue filter params

* Handle cases when DB initialization failed

* chore: permission layer and updated issues v1 query from workspace to project level

* - Switch to using wa-sqlite instead of sqlite-wasm

* Code cleanup and fix indexes

* Add missing files

* - Import only required functions from sentry
- Wait till all the tables are created

* Skip workspace sync if one is already in progress.

* Sync workspace without using transaction

* Minor cleanup

* Close DB connection before deleting files
Fix clear OPFS on safari

* Fix type issue

* Improve issue insert performance

* Refactor workspace sync

* Close the DB connection while switching workspaces

* Update web/core/local-db/worker/db.ts

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Worker cleanup and error handling

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Update web/core/local-db/worker/db.ts

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Update web/core/local-db/storage.sqlite.ts

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Update web/core/local-db/worker/db.ts

Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>

* Code cleanup

* Set default order by to created at and descending

* Wait for transactions to complete.

---------

Co-authored-by: rahulramesha <rahulramesham@gmail.com>
Co-authored-by: gurusainath <gurusainath007@gmail.com>
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
2024-10-24 15:35:02 +05:30
Ketan Sharma
ad25a972a1 [WEB-2587] fix: hide log work button for guest user (#5787)
* fix the rendering logic

* fix handle nullish value
2024-10-24 14:48:59 +05:30
Ketan Sharma
4157f3750b add missing background color (#5789) 2024-10-24 14:46:56 +05:30
Ketan Sharma
d7c5645948 [WEB-2606] fix: project members shouldn't be able to change others roles (#5802)
* [WEB-2606] fix: project members should not be able to change other project member's roles

* add better logic
2024-10-24 14:46:10 +05:30
Anmol Singh Bhatia
8d837eddb3 chore: calendar current date indicator improvement (#5880) 2024-10-24 14:42:44 +05:30
Anmol Singh Bhatia
0312455d66 fix: project state setting dnd (#5881) 2024-10-24 14:41:35 +05:30
Prateek Shourya
e4e83a947a [WEB-2479] fix: merge default and archived issue details endpoint. (#5882) 2024-10-24 14:40:50 +05:30
Akshita Goyal
2ecc379486 fix: truncated project name in analytics dropdown (#5883) 2024-10-24 14:39:32 +05:30
Prateek Shourya
bf220666dd [WEB-2326] fix: issue activity mutation on attachments upload. (#5886) 2024-10-24 14:36:30 +05:30
Anmol Singh Bhatia
074ad6d1a4 chore: intake issue back date snooze disabled (#5888) 2024-10-24 14:35:57 +05:30
Bavisetti Narayan
4b815f3769 fix: issue attachment uploads (#5904) 2024-10-23 21:04:10 +05:30
Anmol Singh Bhatia
56bb6e1f48 fix: draft issue type update outside click (#5902) 2024-10-23 20:11:28 +05:30
Bavisetti Narayan
5afa686a21 chore: issue attachment deletion (#5903) 2024-10-23 20:11:01 +05:30
Anmol Singh Bhatia
25a410719b fix: intake issue description and navigation (#5900) 2024-10-23 16:46:28 +05:30
Anmol Singh Bhatia
cbfcbba5d1 [WEB-2709] chore: intake issue navigation improvement (#5891)
* chore: intake issue navigation improvement

* chore: code refactor

* chore: intake issue navigation improvement

* chore: intake issue navigation improvement
2024-10-23 15:19:43 +05:30
Anmol Singh Bhatia
c4421f5f97 fix: issue widget modal rendering (#5896) 2024-10-23 15:19:26 +05:30
Anmol Singh Bhatia
84c06c4713 fix: guest user intake issue edit validation (#5898) 2024-10-23 15:19:10 +05:30
Bavisetti Narayan
6df98099f5 chore: filter the deleted issues stats (#5893) 2024-10-22 20:51:11 +05:30
Bavisetti Narayan
295f094916 chore: changed the annotate for cycle id (#5892) 2024-10-22 19:02:05 +05:30
Akshita Goyal
d859ab9c39 [WEB-2708] fix: intake module and cycle addition fixed (#5890)
* fix: intake module and cycle addition fixed

* chore: fixed the search endpoint

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-10-22 17:59:07 +05:30
Anmol Singh Bhatia
36b868e375 [WEB-2707] fix: draft issue module update and code refactor (#5889)
* chore: draft issue module update

* chore: code refactor
2024-10-22 16:16:29 +05:30
Aaryan Khandelwal
4c20be6cf2 [PE-68] fix: markdown transformation of mention and custom image components (#5864)
* fix: markdown content for mention and custom image extensions

* style: update issue embed upgrade card

* chore: added string escapes
2024-10-22 14:29:50 +05:30
Bavisetti Narayan
7bf4620bc1 chore: soft deletion of cycle and module (#5884)
* chore: soft deletion of cycle and module

* chore: cycle module soft delete

* chore: added the deletion task

* chore: updated the env example

* chore: cycle issue unique constraints

* chore: updated the Q operator
2024-10-22 14:21:26 +05:30
Nikhil
00eff43f4d fix: bucket policy script to handle error conditions (#5887)
* fix: bucket policy script to handle error conditions

* dev: handle edge cases
2024-10-22 14:19:43 +05:30
sriram veeraghanta
3d3f1b8f74 fix: typescript version consistency 2024-10-22 14:13:28 +05:30
sriram veeraghanta
b87516b0be chore: fixing inconsistent dependencies across the platform (#5885)
* chore: fixing inconsistent dependencies across the platform

* fix: fixing peer dependencies

* chore: yarn lock regeneration
2024-10-22 14:03:34 +05:30
Anmol Singh Bhatia
8a1d3c4cf9 chore: urgent priority icon improvement (#5879) 2024-10-22 13:25:22 +05:30
Akshita Goyal
0f25f39404 WEB-2381 Chore: intake refactor (#5752)
* chore: intake emails and forms

* fix: moved files to ee

* fix: intake form ui

* fix: settings apis integrated

* fix: removed publish api

* fix: removed space app

* fix: lint issue

* fix: removed logs

* fix: removed comment

* fix: improved success image
2024-10-22 12:09:03 +05:30
sriram veeraghanta
fb49644185 fix: renaming the action and formatting 2024-10-21 19:26:16 +05:30
Nikhil
b745a29454 fix: credential sending for file uploads (#5869) 2024-10-21 17:46:46 +05:30
M. Palanikannan
c940a2921e fix: validation of public and private assets (#5878) 2024-10-21 15:59:44 +05:30
Anmol Singh Bhatia
6f8df3279c [WEB-2681] fix: module progress indicator (#5842)
* fix: module progress indicator

* fix: module progress indicator
2024-10-21 15:48:35 +05:30
Prateek Shourya
b833e3b10c [WEB-2674] chore: open parent issues in peek-overview from the parent badge. (#5872)
* [WEB-2674] chore: open parent issues in peek-overview from the parent badge.

* chore: remove `_blank` target from ControlLink.
2024-10-21 14:20:00 +05:30
M. Palanikannan
5a0dc4a65a [PE-69] fix: image restoration fixed for new images in private bucket (#5839)
* regression: image aspect ratio fix

* fix: name of variables changed for clarity

* fix: restore only on error

* fix: restore image by handling it inside the image component

* fix: image restoration fixed and aspect ratio added to old images to stop updates on load

* fix: added back restoring logic for public images

* fix: add conditions

* fix: image attributes types

* fix: return for old images

* fix: remove passive false

* fix: eslint fixes

* fix: stopping infinite loading scenarios while restoring from error
2024-10-21 14:17:05 +05:30
Ketan Sharma
e866571e04 fix backend (#5875) 2024-10-21 13:07:36 +05:30
Bavisetti Narayan
3c3fc7cd6d chore: draft issue listing (#5874) 2024-10-21 13:02:20 +05:30
Bavisetti Narayan
db919420a7 [WEB-2693] chore: removed the deleted cycles from the issue list (#5868)
* chore: added the deleted cycles from list

* chore: removed the extra annotation

* chore: removed the frontend comment
2024-10-18 15:48:34 +05:30
M. Palanikannan
2982cd47a9 fix: remoteImageSrc to come from resolved source (#5867) 2024-10-18 14:21:07 +05:30
M. Palanikannan
81550ab5ef [PE-56] regression: image aspect ratio fix (#5792)
* regression: image aspect ratio fix

* fix: name of variables changed for clarity
2024-10-18 13:40:39 +05:30
Bavisetti Narayan
07402efd79 chore: filtered the deleted labels and modules (#5860) 2024-10-18 13:20:32 +05:30
Prateek Shourya
46302f41bc fix: improvements for project types. (#5857) 2024-10-18 11:08:07 +05:30
Ketan Sharma
9530884c59 fix the logic (#5807) 2024-10-17 17:08:49 +05:30
Prateek Shourya
173b49b4cb [WEB-2431] chore: profile settings page UI improvement (#5838)
* [WEB-2431] chore: timezone and language management.

* chore: remove project level timezone changes.

* chore: minor UI improvement.

* chore: minor improvements
2024-10-17 17:06:22 +05:30
Anmol Singh Bhatia
e581ac890e chore: workspace collaborators improvements (#5846) 2024-10-17 17:05:21 +05:30
Anmol Singh Bhatia
a7b58e4a93 [WEB-2625] chore: workspace favorite and draft improvement (#5855)
* chore: favorite empty state updated

* chore: added draft issue count in workspace members

* chore: workspace draft count improvement

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-10-17 17:02:25 +05:30
Bavisetti Narayan
d552913171 chore: updated queryset for soft delete (#5844) 2024-10-17 17:01:26 +05:30
Bavisetti Narayan
b6a7e45e8d chore: added draft cycle and module in draft issue (#5854) 2024-10-17 13:35:13 +05:30
Aaryan Khandelwal
6209aeec0b fix: color extension not working on issue description and published page (#5852)
* fix: color extension not working

* chore: update types
2024-10-17 13:26:23 +05:30
Anmol Singh Bhatia
1099c59b83 fix: draft issue empty state flicker (#5848) 2024-10-17 12:55:32 +05:30
Nikhil
9b2ffaaca8 fix: draft issue asset conversion to issue (#5849) 2024-10-17 12:51:13 +05:30
sriram veeraghanta
aa93cca7bf fix: workflow fixes 2024-10-16 21:07:01 +05:30
sriram veeraghanta
1191f74bfe fix: workflow fixes 2024-10-16 20:08:25 +05:30
sriram veeraghanta
fbd1f6334a fix: workflow fixes 2024-10-16 20:05:10 +05:30
Anmol Singh Bhatia
7d36d63eb1 [WEB-2682] fix: delete project mutation and workspace draft header validation (#5843)
* fix: workspace draft header action validation

* fix: delete project mutation
2024-10-16 16:13:26 +05:30
Nikhil
9b85306359 dev: move storage metadata collection to background job (#5818)
* fix: move storage metadata collection to background job

* fix: docker compose and env

* fix: archive endpoint
2024-10-16 13:55:49 +05:30
guru_sainath
cc613e57c9 chore: delete deprecated tables (#5833)
* migration: external source and id for issues

* fix: cleaning up deprecated favorite tables

* fix: removing deprecated models

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2024-10-16 00:33:57 +05:30
Bavisetti Narayan
6e63af7ca9 [WEB-2626] chore: removed the deleted issue's count (#5837)
* chore: removed the deleted issue count

* chore: added issue manager in burn down
2024-10-16 00:30:44 +05:30
guru_sainath
5f9af92faf fix: attachment_count in issue pagination v2 endpoint (#5840)
* fix: attachment_count in the issue pagination v2 endpoint

* fix: string comparison in description check in params
2024-10-15 23:46:57 +05:30
Anmol Singh Bhatia
4e70e894f6 chore: workspace draft issue type (#5836) 2024-10-15 18:59:22 +05:30
Anmol Singh Bhatia
ff090ecf39 fix: workspace draft move to project (#5834) 2024-10-15 17:14:56 +05:30
Akshita Goyal
645a261493 fix: Added a common dropdown component (#5826)
* fix: Added a common dropdown component

* fix: dropdown

* fix: estimate dropdown

* fix: removed consoles
2024-10-15 15:17:46 +05:30
Prateek Shourya
8d0611b2a7 [WEB-2613] chore: open parent and sibling issue in new tab from peek-overview/ issue detail page. (#5819) 2024-10-15 13:37:52 +05:30
Bavisetti Narayan
3d7d3c8af1 [WEB-2631] chore: changed the cascading logic for soft delete (#5829)
* chore: changed the cascading logic for soft delete

* chore: changed the delete key

* chore: added the key on delete in project base model
2024-10-15 13:30:44 +05:30
Prateek Shourya
662b99da92 [WEB-2577] improvement: use common create/update issue modal for accepting intake issues for consistency (#5830)
* [WEB-2577] improvement: use common create/update issue modal for accepting intake issues for consistency

* fix: lint errors.

* chore: minor UX copy fix.

* chore: minor indentation fix.
2024-10-15 13:11:14 +05:30
Prateek Shourya
fa25a816a7 [WEB-2549] chore: ux copy update for project access. (#5831) 2024-10-15 12:57:29 +05:30
Anmol Singh Bhatia
ee823d215e [WEB-2629] chore: workspace draft issue ux copy updated (#5825)
* chore: workspace draft issue ux copy updated

* chore: workspace draft issue ux copy updated
2024-10-14 17:26:54 +05:30
Akshita Goyal
4b450f8173 fix: moved dropdowns to chart component + added pending icon (#5824)
* fix: moved dropdowns to chart component + added pending icon

* fix: copy changes

* fix: review changes
2024-10-14 17:00:58 +05:30
Anmol Singh Bhatia
36229d92e0 [WEB-2629] fix: workspace draft delete and move mutation (#5822)
* fix: mutation fix

* chore: code refactor

* chore: code refactor

* chore: useWorkspaceIssueProperties added
2024-10-14 16:50:19 +05:30
Anmol Singh Bhatia
cb90810d02 chore: double click action added and code refactor (#5821) 2024-10-14 16:46:08 +05:30
Anmol Singh Bhatia
658542cc62 [WEB-2616] fix: issue widget attachment (#5820)
* fix: issue widget attachment

* chore: comment added
2024-10-14 16:32:31 +05:30
Nikhil
701af734cd fix: export for analytics and csv (#5815) 2024-10-13 02:11:32 +05:30
Nikhil
cf53cdf6ba fix: analytics tab for private bucket (#5814) 2024-10-13 01:27:48 +05:30
Nikhil
6490ace7c7 fix: intake issue (#5813) 2024-10-13 00:44:52 +05:30
Nikhil
0ac406e8c7 fix: private bucket (#5812)
* fix: workspace level issue creation

* dev: add draft issue support, fix your work tab and cache invalidation for workspace level logos

* chore: issue description

---------

Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
2024-10-13 00:31:28 +05:30
Aaryan Khandelwal
e404450e1a [WEB-310] regression: generate file url function (#5811)
* fix: generate file url function

* chore: remove unused imports

* chore: replace indexOf logic with startsWith
2024-10-12 23:39:50 +05:30
sriram veeraghanta
7cc86ad4c0 chore: removing unused packages 2024-10-12 01:43:22 +05:30
Anmol Singh Bhatia
3acc9ec133 fix: intake exception error (#5810) 2024-10-11 22:01:39 +05:30
Anmol Singh Bhatia
286ab7f650 fix: workspace draft issues count (#5809) 2024-10-11 21:28:05 +05:30
Aaryan Khandelwal
7e334203f1 [WEB-310] dev: private bucket implementation (#5793)
* chore: migrations and backmigration to move attachments to file asset

* chore: move attachments to file assets

* chore: update migration file to include created by and updated by and size

* chore: remove uninmport errors

* chore: make size as float field

* fix: file asset uploads

* chore: asset uploads migration changes

* chore: v2 assets endpoint

* chore: remove unused imports

* chore: issue attachments

* chore: issue attachments

* chore: workspace logo endpoints

* chore: private bucket changes

* chore: user asset endpoint

* chore: add logo_url validation

* chore: cover image url

* chore: change asset max length

* chore: pages endpoint

* chore: store the storage_metadata only when none

* chore: attachment asset apis

* chore: update create private bucket

* chore: make bucket private

* chore: fix response of user uploads

* fix: response of user uploads

* fix: job to fix file asset uploads

* fix: user asset endpoints

* chore: avatar for user profile

* chore: external apis user url endpoint

* chore: upload workspace and user asset actions updated

* chore: analytics endpoint

* fix: analytics export

* chore: avatar urls

* chore: update user avatar instances

* chore: avatar urls for assignees and creators

* chore: bucket permission script

* fix: all user avatar instances in the web app

* chore: update project cover image logic

* fix: issue attachment endpoint

* chore: patch endpoint for issue attachment

* chore: attachments

* chore: change attachment storage class

* chore: update issue attachment endpoints

* fix: issue attachment

* chore: update issue attachment implementation

* chore: page asset endpoints

* fix: web build errors

* chore: attachments

* chore: page asset urls

* chore: comment and issue asset endpoints

* chore: asset endpoints

* chore: attachment endpoints

* chore: bulk asset endpoint

* chore: restore endpoint

* chore: project assets endpoints

* chore: asset url

* chore: add delete asset endpoints

* chore: fix asset upload endpoint

* chore: update patch endpoints

* chore: update patch endpoint

* chore: update editor image handling

* chore: asset restore endpoints

* chore: avatar url for space assets

* chore: space app assets migration

* fix: space app urls

* chore: space endpoints

* fix: old editor images rendering logic

* fix: issue archive and attachment activity

* chore: asset deletes

* chore: attachment delete

* fix: issue attachment

* fix: issue attachment get

* chore: cover image url for projects

* chore: remove duplicate py file

* fix: url check function

* chore: project cover asset delete

* fix: migrations

* chore: delete migration files

* chore: update bucket

* fix: build errors

* chore: add asset url in intake attachment

* chore: project cover fix

* chore: update next.config

* chore: delete old workspace logos

* chore: workspace assets

* chore: asset get for space

* chore: update project modal

* chore: remove unused imports

* fix: space app editor helper

* chore: update rich-text read-only editor

* chore: create multiple columns for entity identifiers

* chore: update migrations

* chore: remove entity identifier

* fix: issue assets

* chore: update maximum file size logic

* chore: update editor max file size logic

* fix: close modal after removing workspace logo

* chore: update uploaded assets' status post issue creation

* chore: added file size limit to the space app

* dev: add file size limit restriction on all endpoints

* fix: remove old workspace logo and user avatar

---------

Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
2024-10-11 20:13:38 +05:30
Anmol Singh Bhatia
c9580ab794 chore: workspace draft issue improvements (#5808) 2024-10-11 19:51:38 +05:30
Aaryan Khandelwal
e7065af358 [WEB-2494] dev: custom text color and background color extensions (#5786)
* dev: created custom text color and background color extensions

* chore: update slash commands icon style

* chore: update constants

* chore: update variables css file selectors
2024-10-11 19:11:39 +05:30
Manish Gupta
74695e561a modified the action name (#5806) 2024-10-11 18:05:53 +05:30
Anmol Singh Bhatia
c9dbd1d5d1 [WEB-2388] chore: theme changes and workspace draft issue total count updated (#5805)
* chore: theme changes and total count updated

* chore: code refactor
2024-10-11 17:57:48 +05:30
Manish Gupta
6200890693 fix: updated branch build action with BUILD/RELEASE options (#5803) 2024-10-11 17:25:25 +05:30
guru_sainath
3011ef9da1 build-error: removed store prop from calendar store (#5801) 2024-10-11 15:53:58 +05:30
Anmol Singh Bhatia
bf7b3229d1 [WEB-2388] fix: workspace draft issues (#5800)
* fix: create issue modal handle close

* fix: workspace level draft issue store update

* chore: count added

* chore: added description html in list endpoint

* fix: workspace draft issue mutation

* fix: workspace draft issue empty state and count

---------

Co-authored-by: gurusainath <gurusainath007@gmail.com>
2024-10-11 15:23:32 +05:30
rahulramesha
2c96e042c6 fix workspace drafts build (#5798) 2024-10-10 22:59:27 +05:30
M. Palanikannan
c68658d877 [PE-56] fix: image aspect ratio (#5794)
* regression: image aspect ratio fix

* fix: name of variables changed for clarity
2024-10-10 20:53:20 +05:30
rahulramesha
9c2278a810 fix workspace draft build (#5795) 2024-10-10 20:50:43 +05:30
Anmol Singh Bhatia
332d2d5c68 [WEB-2388] dev: workspace draft issues (#5772)
* chore: workspace draft page added

* chore: workspace draft issues services added

* chore: workspace draft issue store added

* chore: workspace draft issue filter store added

* chore: issue rendering

* conflicts: resolved merge conflicts

* conflicts: handled draft issue store

* chore: draft issue modal

* chore: code optimisation

* chore: ui changes

* chore: workspace draft store and modal updated

* chore: workspace draft issue component added

* chore: updated store and workflow in draft issues

* chore: updated issue draft store

* chore: updated issue type cleanup in components

* chore: code refactor

* fix: build error

* fix: quick actions

* fix: update mutation

* fix: create update modal

* chore: commented project draft issue code

---------

Co-authored-by: gurusainath <gurusainath007@gmail.com>
Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-10-10 19:12:34 +05:30
guru_sainath
e9158f820f [WEB-2615] fix: module date validation during chart distribution generation (#5791)
* fix: module date validation while generating the chart distribution

* chore: indentation fix

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-10-10 18:33:59 +05:30
sriram veeraghanta
1e1733f6db Merge branch 'master' of github.com:makeplane/plane into preview 2024-10-10 17:24:47 +05:30
Bavisetti Narayan
5573d85d80 chore: only admins can delete a project (#5790) 2024-10-10 17:24:18 +05:30
sriram veeraghanta
c1f881b2d1 Merge branch 'develop' of github.com:makeplane/plane into preview 2024-10-10 15:11:33 +05:30
sriram veeraghanta
9bab108329 Merge pull request #5788 from makeplane/preview
release: v0.23.1
2024-10-10 15:11:04 +05:30
sriram veeraghanta
5f4875cc60 fix: version bump 2024-10-10 15:05:03 +05:30
sriram veeraghanta
0c1c6dee99 fix: adding scheduled tracing 2024-10-10 14:57:42 +05:30
sriram veeraghanta
1639f34db0 Merge branch 'preview' of github.com:makeplane/plane into develop 2024-10-10 14:07:25 +05:30
Bavisetti Narayan
8a866e440c chore: only admins can change the project settings (#5766) 2024-10-10 14:06:14 +05:30
Prateek Shourya
7495a7d0cb [WEB-2605] fix: update URL regex pattern to allow complex links. (#5767) 2024-10-10 14:06:14 +05:30
M. Palanikannan
2b1da96c3f fix: drag handle scrolling fixed (#5619)
* fix: drag handle scrolling fixed

* fix: closest scrollable parent found and scrolled

* fix: removed overflow auto from framerenderer

* fix: make dragging dynamic and smoother
2024-10-10 14:06:14 +05:30
Aaryan Khandelwal
daa06f1831 [WEB-2532] fix: custom theme mutation logic (#5685)
* fix: custom theme mutation logic

* chore: update querySelector element
2024-10-10 14:06:14 +05:30
M. Palanikannan
b97fcfb46d fix: show the full screen toolbar in read only instances as well (#5746) 2024-10-10 14:06:14 +05:30
M. Palanikannan
852fc9bac1 [WEB-2603] fix: remove validation of roles from the live server (#5761)
* fix: remove validation of roles from the live server

* chore: remove the service

* fix: remove all validation of authorization

* fix: props updated
2024-10-10 14:06:14 +05:30
Akshita Goyal
55f44e0245 fix: spreadsheet flicker issue (#5769) 2024-10-10 14:06:14 +05:30
Prateek Shourya
8981e52dcc [WEB-2601] improvement: add click to copy issue identifier on peek-overview and issue detail page. (#5760) 2024-10-10 14:06:14 +05:30
Akshita Goyal
d92dbaea72 [WEB-2589] Chore: inbox issue permissions (#5763)
* chore: changed permission in inbox issue

* chore: fixed permissions for intake

* fix: refactoring

* fix: lint

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-10-10 14:06:14 +05:30
dependabot[bot]
58f3d0a68c chore(deps): bump django in /apiserver/requirements (#5781)
Bumps [django](https://github.com/django/django) from 4.2.15 to 4.2.16.
- [Commits](https://github.com/django/django/compare/4.2.15...4.2.16)

---
updated-dependencies:
- dependency-name: django
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-10-10 14:06:14 +05:30
Akshita Goyal
45880b3a72 [WEB-2589] Chore: inbox issue permissions (#5763)
* chore: changed permission in inbox issue

* chore: fixed permissions for intake

* fix: refactoring

* fix: lint

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-10-09 17:48:52 +05:30
dependabot[bot]
992adb9794 chore(deps): bump django in /apiserver/requirements (#5781)
Bumps [django](https://github.com/django/django) from 4.2.15 to 4.2.16.
- [Commits](https://github.com/django/django/compare/4.2.15...4.2.16)

---
updated-dependencies:
- dependency-name: django
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-10-09 17:26:33 +05:30
Akshita Goyal
6d78418e79 fix: create cycle function (#5775)
* fix: create cycle function

* chore: draft and cycle version changes

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-10-08 20:01:15 +05:30
Prateek Shourya
6e52f1b434 [WEB-2601] improvement: add click to copy issue identifier on peek-overview and issue detail page. (#5760) 2024-10-08 18:43:13 +05:30
Aaryan Khandelwal
c3c1ea727d [WEB-2494] feat: text color and highlight options for all editors (#5653)
* feat: add text color and highlight options to pages

* style: rich text editor floating toolbar

* chore: remove unused function

* refactor: slash command components

* chore: move default text and background options to the top

* fix: sections filtering logic
2024-10-08 18:42:47 +05:30
Aaryan Khandelwal
5afc576dec refactor: export components (#5773) 2024-10-08 18:41:08 +05:30
Ketan Sharma
50ae32f3e1 [WEB-2555] fix: add "mark all as read" in the notifications header (#5770)
* move mark all as read to header and remove it from dropdown

* made recommended changes
2024-10-08 17:13:35 +05:30
Akshita Goyal
0451593057 fix: spreadsheet flicker issue (#5769) 2024-10-08 17:10:16 +05:30
M. Palanikannan
be092ac99f [WEB-2603] fix: remove validation of roles from the live server (#5761)
* fix: remove validation of roles from the live server

* chore: remove the service

* fix: remove all validation of authorization

* fix: props updated
2024-10-08 16:55:26 +05:30
Anmol Singh Bhatia
f73a603226 [WEB-2380] chore: cycle sidebar refactor (#5759)
* chore: cycle sidebar refactor

* chore: code splitting

* chore: code refactor

* chore: code refactor
2024-10-08 16:54:44 +05:30
Aaryan Khandelwal
b27249486a [PE-45] feat: page export as PDF & Markdown (#5705)
* feat: export page as pdf and markdown

* chore: add image conversion logic
2024-10-08 16:54:02 +05:30
Anmol Singh Bhatia
20c9e232e7 chore: IssueParentDetail added to issue peekoverview (#5751) 2024-10-08 16:53:07 +05:30
Bavisetti Narayan
d168fd4bfa [WEB-2388] fix: workspace draft issues migration (#5749)
* fix: workspace draft issues

* chore: changed the timezone key

* chore: migration changes
2024-10-08 16:51:57 +05:30
M. Palanikannan
7317975b04 fix: show the full screen toolbar in read only instances as well (#5746) 2024-10-08 16:50:32 +05:30
Aaryan Khandelwal
39195d0d89 [WEB-2532] fix: custom theme mutation logic (#5685)
* fix: custom theme mutation logic

* chore: update querySelector element
2024-10-08 16:47:16 +05:30
Mihir
6bf0e27b66 [WEB-2433] chore: update name of the Layout (#5661)
* Updated layout names

* Corrected character casing for titles
2024-10-08 16:44:50 +05:30
M. Palanikannan
5fb7e98b7c fix: drag handle scrolling fixed (#5619)
* fix: drag handle scrolling fixed

* fix: closest scrollable parent found and scrolled

* fix: removed overflow auto from framerenderer

* fix: make dragging dynamic and smoother
2024-10-08 16:44:05 +05:30
sriram veeraghanta
d97ca68229 Merge pull request #5764 from makeplane/preview
release: v0.23.0
2024-10-07 18:54:49 +05:30
sriram veeraghanta
707570ca7a Merge pull request #5041 from makeplane/preview
release: v0.22-dev
2024-07-05 13:28:45 +05:30
sriram veeraghanta
c76af7d7d6 Merge pull request #4688 from makeplane/preview
release: v0.21-dev
2024-06-03 18:54:06 +05:30
sriram veeraghanta
1dcea9bcc8 Merge pull request #4569 from makeplane/preview
release: v0.20-dev
2024-05-23 19:55:06 +05:30
sriram veeraghanta
da957e06b6 Merge pull request #4349 from makeplane/preview
release: v0.19-dev
2024-05-03 20:36:07 +05:30
sriram veeraghanta
a0b9596cb4 Merge pull request #4239 from makeplane/preview
chore: version update
2024-04-19 12:01:15 +05:30
sriram veeraghanta
f71e8a3a0f Merge pull request #4238 from makeplane/preview
release: v0.18-dev
2024-04-19 11:56:03 +05:30
sriram veeraghanta
002fb4547b Merge pull request #4107 from makeplane/preview
release: v0.17-dev
2024-04-02 20:07:48 +05:30
sriram veeraghanta
c1b1ba35c1 Merge pull request #3878 from makeplane/preview
release: v0.16-dev
2024-03-05 20:04:08 +05:30
sriram veeraghanta
4566d6e80c Merge pull request #3697 from makeplane/preview
release: 0.15.4-dev
2024-02-19 19:30:06 +05:30
sriram veeraghanta
e8d359e625 Merge pull request #3674 from makeplane/preview
fix: build branch docker images push on release
2024-02-15 14:35:32 +05:30
sriram veeraghanta
351eba8d61 Merge pull request #3671 from makeplane/preview
release: peek overview issue description initial load bug (#3670)
2024-02-15 03:25:30 +05:30
sriram veeraghanta
1e27e37b51 Merge pull request #3666 from makeplane/preview
release: v0.15.2-dev
2024-02-14 19:41:55 +05:30
sriram veeraghanta
7df2e9cf11 Merge pull request #3632 from makeplane/preview
release: v0.15.1-dev
2024-02-12 20:59:56 +05:30
sriram veeraghanta
c6e3f1b932 Merge pull request #3535 from makeplane/preview
release: 0.15-dev
2024-02-01 15:01:49 +05:30
1409 changed files with 46298 additions and 26302 deletions

.github/actions/build-push-ce/action.yml

@@ -0,0 +1,126 @@
name: "Build and Push Docker Image"
description: "Reusable action for building and pushing Docker images"
inputs:
docker-username:
description: "The Dockerhub username"
required: true
docker-token:
description: "The Dockerhub Token"
required: true
# Docker Image Options
docker-image-owner:
description: "The owner of the Docker image"
required: true
docker-image-name:
description: "The name of the Docker image"
required: true
build-context:
description: "The build context"
required: true
default: "."
dockerfile-path:
description: "The path to the Dockerfile"
required: true
build-args:
description: "The build arguments"
required: false
default: ""
# Buildx Options
buildx-driver:
description: "Buildx driver"
required: true
default: "docker-container"
buildx-version:
description: "Buildx version"
required: true
default: "latest"
buildx-platforms:
description: "Buildx platforms"
required: true
default: "linux/amd64"
buildx-endpoint:
description: "Buildx endpoint"
required: true
default: "default"
# Release Build Options
build-release:
description: "Flag to publish release"
required: false
default: "false"
build-prerelease:
description: "Flag to publish prerelease"
required: false
default: "false"
release-version:
description: "The release version"
required: false
default: "latest"
runs:
using: "composite"
steps:
- name: Set Docker Tag
shell: bash
env:
IMG_OWNER: ${{ inputs.docker-image-owner }}
IMG_NAME: ${{ inputs.docker-image-name }}
BUILD_RELEASE: ${{ inputs.build-release }}
IS_PRERELEASE: ${{ inputs.build-prerelease }}
REL_VERSION: ${{ inputs.release-version }}
run: |
FLAT_BRANCH_VERSION=$(echo "${{ github.ref_name }}" | sed 's/[^a-zA-Z0-9.-]//g')
if [ "${{ env.BUILD_RELEASE }}" == "true" ]; then
semver_regex="^v([0-9]+)\.([0-9]+)\.([0-9]+)(-[a-zA-Z0-9]+(-[a-zA-Z0-9]+)*)?$"
if [[ ! ${{ env.REL_VERSION }} =~ $semver_regex ]]; then
echo "Invalid Release Version Format : ${{ env.REL_VERSION }}"
echo "Please provide a valid SemVer version"
echo "e.g. v1.2.3 or v1.2.3-alpha-1"
echo "Exiting the build process"
exit 1 # Exit with status 1 to fail the step
fi
TAG=${{ env.IMG_OWNER }}/${{ env.IMG_NAME }}:${{ env.REL_VERSION }}
if [ "${{ env.IS_PRERELEASE }}" != "true" ]; then
TAG=${TAG},${{ env.IMG_OWNER }}/${{ env.IMG_NAME }}:stable
fi
elif [ "${{ env.TARGET_BRANCH }}" == "master" ]; then
TAG=${{ env.IMG_OWNER }}/${{ env.IMG_NAME }}:latest
else
TAG=${{ env.IMG_OWNER }}/${{ env.IMG_NAME }}:${FLAT_BRANCH_VERSION}
fi
echo "DOCKER_TAGS=${TAG}" >> $GITHUB_ENV
- name: Login to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ inputs.docker-username }}
password: ${{ inputs.docker-token}}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
with:
driver: ${{ inputs.buildx-driver }}
version: ${{ inputs.buildx-version }}
endpoint: ${{ inputs.buildx-endpoint }}
- name: Check out the repo
uses: actions/checkout@v4
- name: Build and Push Docker Image
uses: docker/build-push-action@v5.1.0
with:
context: ${{ inputs.build-context }}
file: ${{ inputs.dockerfile-path }}
platforms: ${{ inputs.buildx-platforms }}
tags: ${{ env.DOCKER_TAGS }}
push: true
build-args: ${{ inputs.build-args }}
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ inputs.docker-username }}
DOCKER_PASSWORD: ${{ inputs.docker-token }}

.github/pull_request_template.md

@@ -0,0 +1,20 @@
### Description
<!-- Provide a detailed description of the changes in this PR -->
### Type of Change
<!-- Put an 'x' in the boxes that apply -->
- [ ] Bug fix (non-breaking change which fixes an issue)
- [ ] Feature (non-breaking change which adds functionality)
- [ ] Improvement (change that would cause existing functionality to not work as expected)
- [ ] Code refactoring
- [ ] Performance improvements
- [ ] Documentation update
### Screenshots and Media (if applicable)
<!-- Add screenshots to help explain your changes, ideally showcasing before and after -->
### Test Scenarios
<!-- Please describe the tests that you ran to verify your changes -->
### References
<!-- Link related issues if there are any -->

View File

@@ -83,7 +83,7 @@ jobs:
endpoint: ${{ env.BUILDX_ENDPOINT }}
- name: Build and Push to Docker Hub
uses: docker/build-push-action@v5.1.0
uses: docker/build-push-action@v6.9.0
with:
context: ./aio
file: ./aio/Dockerfile-base-full
@@ -124,7 +124,7 @@ jobs:
endpoint: ${{ env.BUILDX_ENDPOINT }}
- name: Build and Push to Docker Hub
uses: docker/build-push-action@v5.1.0
uses: docker/build-push-action@v6.9.0
with:
context: ./aio
file: ./aio/Dockerfile-base-slim

View File

@@ -128,7 +128,7 @@ jobs:
uses: actions/checkout@v4
- name: Build and Push to Docker Hub
uses: docker/build-push-action@v5.1.0
uses: docker/build-push-action@v6.9.0
with:
context: .
file: ./aio/Dockerfile-app
@@ -188,7 +188,7 @@ jobs:
uses: actions/checkout@v4
- name: Build and Push to Docker Hub
uses: docker/build-push-action@v5.1.0
uses: docker/build-push-action@v6.9.0
with:
context: .
file: ./aio/Dockerfile-app

View File

@@ -1,29 +1,42 @@
name: Branch Build
name: Branch Build CE
on:
workflow_dispatch:
inputs:
build_type:
description: "Type of build to run"
required: true
type: choice
default: "Build"
options:
- "Build"
- "Release"
releaseVersion:
description: "Release Version"
type: string
default: v0.0.0
isPrerelease:
description: "Is Pre-release"
type: boolean
default: false
required: true
arm64:
description: "Build for ARM64 architecture"
required: false
default: false
type: boolean
push:
branches:
- master
- preview
release:
types: [released, prereleased]
env:
TARGET_BRANCH: ${{ github.ref_name || github.event.release.target_commitish }}
TARGET_BRANCH: ${{ github.ref_name }}
ARM64_BUILD: ${{ github.event.inputs.arm64 }}
IS_PRERELEASE: ${{ github.event.release.prerelease }}
BUILD_TYPE: ${{ github.event.inputs.build_type }}
RELEASE_VERSION: ${{ github.event.inputs.releaseVersion }}
IS_PRERELEASE: ${{ github.event.inputs.isPrerelease }}
jobs:
branch_build_setup:
name: Build Setup
runs-on: ubuntu-latest
runs-on: ubuntu-20.04
outputs:
gh_branch_name: ${{ steps.set_env_variables.outputs.TARGET_BRANCH }}
gh_buildx_driver: ${{ steps.set_env_variables.outputs.BUILDX_DRIVER }}
@@ -36,13 +49,24 @@ jobs:
build_space: ${{ steps.changed_files.outputs.space_any_changed }}
build_web: ${{ steps.changed_files.outputs.web_any_changed }}
build_live: ${{ steps.changed_files.outputs.live_any_changed }}
flat_branch_name: ${{ steps.set_env_variables.outputs.FLAT_BRANCH_NAME }}
dh_img_web: ${{ steps.set_env_variables.outputs.DH_IMG_WEB }}
dh_img_space: ${{ steps.set_env_variables.outputs.DH_IMG_SPACE }}
dh_img_admin: ${{ steps.set_env_variables.outputs.DH_IMG_ADMIN }}
dh_img_live: ${{ steps.set_env_variables.outputs.DH_IMG_LIVE }}
dh_img_backend: ${{ steps.set_env_variables.outputs.DH_IMG_BACKEND }}
dh_img_proxy: ${{ steps.set_env_variables.outputs.DH_IMG_PROXY }}
build_type: ${{steps.set_env_variables.outputs.BUILD_TYPE}}
build_release: ${{ steps.set_env_variables.outputs.BUILD_RELEASE }}
build_prerelease: ${{ steps.set_env_variables.outputs.BUILD_PRERELEASE }}
release_version: ${{ steps.set_env_variables.outputs.RELEASE_VERSION }}
steps:
- id: set_env_variables
name: Set Environment Variables
run: |
if [ "${{ env.TARGET_BRANCH }}" == "master" ] || [ "${{ env.ARM64_BUILD }}" == "true" ] || ([ "${{ github.event_name }}" == "release" ] && [ "${{ env.IS_PRERELEASE }}" != "true" ]); then
if [ "${{ env.ARM64_BUILD }}" == "true" ] || ([ "${{ env.BUILD_TYPE }}" == "Release" ] && [ "${{ env.IS_PRERELEASE }}" != "true" ]); then
echo "BUILDX_DRIVER=cloud" >> $GITHUB_OUTPUT
echo "BUILDX_VERSION=lab:latest" >> $GITHUB_OUTPUT
echo "BUILDX_PLATFORMS=linux/amd64,linux/arm64" >> $GITHUB_OUTPUT
@@ -53,9 +77,43 @@ jobs:
echo "BUILDX_PLATFORMS=linux/amd64" >> $GITHUB_OUTPUT
echo "BUILDX_ENDPOINT=" >> $GITHUB_OUTPUT
fi
echo "TARGET_BRANCH=${{ env.TARGET_BRANCH }}" >> $GITHUB_OUTPUT
flat_branch_name=$(echo ${{ env.TARGET_BRANCH }} | sed 's/[^a-zA-Z0-9\._]/-/g')
echo "FLAT_BRANCH_NAME=${flat_branch_name}" >> $GITHUB_OUTPUT
BR_NAME=$( echo "${{ env.TARGET_BRANCH }}" |sed 's/[^a-zA-Z0-9.-]//g')
echo "TARGET_BRANCH=$BR_NAME" >> $GITHUB_OUTPUT
echo "DH_IMG_WEB=plane-frontend" >> $GITHUB_OUTPUT
echo "DH_IMG_SPACE=plane-space" >> $GITHUB_OUTPUT
echo "DH_IMG_ADMIN=plane-admin" >> $GITHUB_OUTPUT
echo "DH_IMG_LIVE=plane-live" >> $GITHUB_OUTPUT
echo "DH_IMG_BACKEND=plane-backend" >> $GITHUB_OUTPUT
echo "DH_IMG_PROXY=plane-proxy" >> $GITHUB_OUTPUT
echo "BUILD_TYPE=${{env.BUILD_TYPE}}" >> $GITHUB_OUTPUT
BUILD_RELEASE=false
BUILD_PRERELEASE=false
RELVERSION="latest"
if [ "${{ env.BUILD_TYPE }}" == "Release" ]; then
FLAT_RELEASE_VERSION=$(echo "${{ env.RELEASE_VERSION }}" | sed 's/[^a-zA-Z0-9.-]//g')
echo "FLAT_RELEASE_VERSION=${FLAT_RELEASE_VERSION}" >> $GITHUB_OUTPUT
semver_regex="^v([0-9]+)\.([0-9]+)\.([0-9]+)(-[a-zA-Z0-9]+(-[a-zA-Z0-9]+)*)?$"
if [[ ! $FLAT_RELEASE_VERSION =~ $semver_regex ]]; then
echo "Invalid Release Version Format : $FLAT_RELEASE_VERSION"
echo "Please provide a valid SemVer version"
echo "e.g. v1.2.3 or v1.2.3-alpha-1"
echo "Exiting the build process"
exit 1 # Exit with status 1 to fail the step
fi
BUILD_RELEASE=true
RELVERSION=$FLAT_RELEASE_VERSION
if [ "${{ env.IS_PRERELEASE }}" == "true" ]; then
BUILD_PRERELEASE=true
fi
fi
echo "BUILD_RELEASE=${BUILD_RELEASE}" >> $GITHUB_OUTPUT
echo "BUILD_PRERELEASE=${BUILD_PRERELEASE}" >> $GITHUB_OUTPUT
echo "RELEASE_VERSION=${RELVERSION}" >> $GITHUB_OUTPUT
- id: checkout_files
name: Checkout Files
@@ -73,24 +131,24 @@ jobs:
admin:
- admin/**
- packages/**
- 'package.json'
- 'yarn.lock'
- 'tsconfig.json'
- 'turbo.json'
- "package.json"
- "yarn.lock"
- "tsconfig.json"
- "turbo.json"
space:
- space/**
- packages/**
- 'package.json'
- 'yarn.lock'
- 'tsconfig.json'
- 'turbo.json'
- "package.json"
- "yarn.lock"
- "tsconfig.json"
- "turbo.json"
web:
- web/**
- packages/**
- 'package.json'
- 'yarn.lock'
- 'tsconfig.json'
- 'turbo.json'
- "package.json"
- "yarn.lock"
- "tsconfig.json"
- "turbo.json"
live:
- live/**
- packages/**
@@ -99,338 +157,225 @@ jobs:
- 'tsconfig.json'
- 'turbo.json'
branch_build_push_web:
if: ${{ needs.branch_build_setup.outputs.build_web == 'true' || github.event_name == 'workflow_dispatch' || github.event_name == 'release' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
name: Build-Push Web Docker Image
runs-on: ubuntu-20.04
needs: [branch_build_setup]
env:
FRONTEND_TAG: makeplane/plane-frontend:${{ needs.branch_build_setup.outputs.flat_branch_name }}
TARGET_BRANCH: ${{ needs.branch_build_setup.outputs.gh_branch_name }}
BUILDX_DRIVER: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
BUILDX_VERSION: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
BUILDX_PLATFORMS: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
BUILDX_ENDPOINT: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
steps:
- name: Set Frontend Docker Tag
run: |
if [ "${{ github.event_name }}" == "release" ]; then
TAG=makeplane/plane-frontend:${{ github.event.release.tag_name }}
if [ "${{ env.IS_PRERELEASE }}" != "true" ]; then
TAG=${TAG},makeplane/plane-frontend:stable
fi
elif [ "${{ env.TARGET_BRANCH }}" == "master" ]; then
TAG=makeplane/plane-frontend:latest
else
TAG=${{ env.FRONTEND_TAG }}
fi
echo "FRONTEND_TAG=${TAG}" >> $GITHUB_ENV
- name: Login to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
with:
driver: ${{ env.BUILDX_DRIVER }}
version: ${{ env.BUILDX_VERSION }}
endpoint: ${{ env.BUILDX_ENDPOINT }}
- name: Check out the repo
uses: actions/checkout@v4
- name: Build and Push Frontend to Docker Container Registry
uses: docker/build-push-action@v5.1.0
with:
context: .
file: ./web/Dockerfile.web
platforms: ${{ env.BUILDX_PLATFORMS }}
tags: ${{ env.FRONTEND_TAG }}
push: true
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
branch_build_push_admin:
if: ${{ needs.branch_build_setup.outputs.build_admin== 'true' || github.event_name == 'workflow_dispatch' || github.event_name == 'release' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
if: ${{ needs.branch_build_setup.outputs.build_admin == 'true' || github.event_name == 'workflow_dispatch' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
name: Build-Push Admin Docker Image
runs-on: ubuntu-20.04
needs: [branch_build_setup]
env:
ADMIN_TAG: makeplane/plane-admin:${{ needs.branch_build_setup.outputs.flat_branch_name }}
TARGET_BRANCH: ${{ needs.branch_build_setup.outputs.gh_branch_name }}
BUILDX_DRIVER: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
BUILDX_VERSION: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
BUILDX_PLATFORMS: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
BUILDX_ENDPOINT: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
steps:
- name: Set Admin Docker Tag
run: |
if [ "${{ github.event_name }}" == "release" ]; then
TAG=makeplane/plane-admin:${{ github.event.release.tag_name }}
if [ "${{ env.IS_PRERELEASE }}" != "true" ]; then
TAG=${TAG},makeplane/plane-admin:stable
fi
elif [ "${{ env.TARGET_BRANCH }}" == "master" ]; then
TAG=makeplane/plane-admin:latest
else
TAG=${{ env.ADMIN_TAG }}
fi
echo "ADMIN_TAG=${TAG}" >> $GITHUB_ENV
- name: Login to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
with:
driver: ${{ env.BUILDX_DRIVER }}
version: ${{ env.BUILDX_VERSION }}
endpoint: ${{ env.BUILDX_ENDPOINT }}
- name: Check out the repo
- id: checkout_files
name: Checkout Files
uses: actions/checkout@v4
- name: Build and Push Frontend to Docker Container Registry
uses: docker/build-push-action@v5.1.0
- name: Admin Build and Push
uses: ./.github/actions/build-push-ce
with:
context: .
file: ./admin/Dockerfile.admin
platforms: ${{ env.BUILDX_PLATFORMS }}
tags: ${{ env.ADMIN_TAG }}
push: true
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
build-release: ${{ needs.branch_build_setup.outputs.build_release }}
build-prerelease: ${{ needs.branch_build_setup.outputs.build_prerelease }}
release-version: ${{ needs.branch_build_setup.outputs.release_version }}
docker-username: ${{ secrets.DOCKERHUB_USERNAME }}
docker-token: ${{ secrets.DOCKERHUB_TOKEN }}
docker-image-owner: makeplane
docker-image-name: ${{ needs.branch_build_setup.outputs.dh_img_admin }}
build-context: .
dockerfile-path: ./admin/Dockerfile.admin
buildx-driver: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
buildx-version: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
buildx-platforms: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
buildx-endpoint: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
branch_build_push_web:
if: ${{ needs.branch_build_setup.outputs.build_web == 'true' || github.event_name == 'workflow_dispatch' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
name: Build-Push Web Docker Image
runs-on: ubuntu-20.04
needs: [branch_build_setup]
steps:
- id: checkout_files
name: Checkout Files
uses: actions/checkout@v4
- name: Web Build and Push
uses: ./.github/actions/build-push-ce
with:
build-release: ${{ needs.branch_build_setup.outputs.build_release }}
build-prerelease: ${{ needs.branch_build_setup.outputs.build_prerelease }}
release-version: ${{ needs.branch_build_setup.outputs.release_version }}
docker-username: ${{ secrets.DOCKERHUB_USERNAME }}
docker-token: ${{ secrets.DOCKERHUB_TOKEN }}
docker-image-owner: makeplane
docker-image-name: ${{ needs.branch_build_setup.outputs.dh_img_web }}
build-context: .
dockerfile-path: ./web/Dockerfile.web
buildx-driver: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
buildx-version: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
buildx-platforms: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
buildx-endpoint: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
branch_build_push_space:
if: ${{ needs.branch_build_setup.outputs.build_space == 'true' || github.event_name == 'workflow_dispatch' || github.event_name == 'release' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
if: ${{ needs.branch_build_setup.outputs.build_space == 'true' || github.event_name == 'workflow_dispatch' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
name: Build-Push Space Docker Image
runs-on: ubuntu-20.04
needs: [branch_build_setup]
env:
SPACE_TAG: makeplane/plane-space:${{ needs.branch_build_setup.outputs.flat_branch_name }}
TARGET_BRANCH: ${{ needs.branch_build_setup.outputs.gh_branch_name }}
BUILDX_DRIVER: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
BUILDX_VERSION: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
BUILDX_PLATFORMS: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
BUILDX_ENDPOINT: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
steps:
- name: Set Space Docker Tag
run: |
if [ "${{ github.event_name }}" == "release" ]; then
TAG=makeplane/plane-space:${{ github.event.release.tag_name }}
if [ "${{ env.IS_PRERELEASE }}" != "true" ]; then
TAG=${TAG},makeplane/plane-space:stable
fi
elif [ "${{ env.TARGET_BRANCH }}" == "master" ]; then
TAG=makeplane/plane-space:latest
else
TAG=${{ env.SPACE_TAG }}
fi
echo "SPACE_TAG=${TAG}" >> $GITHUB_ENV
- name: Login to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
with:
driver: ${{ env.BUILDX_DRIVER }}
version: ${{ env.BUILDX_VERSION }}
endpoint: ${{ env.BUILDX_ENDPOINT }}
- name: Check out the repo
- id: checkout_files
name: Checkout Files
uses: actions/checkout@v4
- name: Build and Push Space to Docker Hub
uses: docker/build-push-action@v5.1.0
- name: Space Build and Push
uses: ./.github/actions/build-push-ce
with:
context: .
file: ./space/Dockerfile.space
platforms: ${{ env.BUILDX_PLATFORMS }}
tags: ${{ env.SPACE_TAG }}
push: true
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
branch_build_push_apiserver:
if: ${{ needs.branch_build_setup.outputs.build_apiserver == 'true' || github.event_name == 'workflow_dispatch' || github.event_name == 'release' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
name: Build-Push API Server Docker Image
runs-on: ubuntu-20.04
needs: [branch_build_setup]
env:
BACKEND_TAG: makeplane/plane-backend:${{ needs.branch_build_setup.outputs.flat_branch_name }}
TARGET_BRANCH: ${{ needs.branch_build_setup.outputs.gh_branch_name }}
BUILDX_DRIVER: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
BUILDX_VERSION: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
BUILDX_PLATFORMS: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
BUILDX_ENDPOINT: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
steps:
- name: Set Backend Docker Tag
run: |
if [ "${{ github.event_name }}" == "release" ]; then
TAG=makeplane/plane-backend:${{ github.event.release.tag_name }}
if [ "${{ env.IS_PRERELEASE }}" != "true" ]; then
TAG=${TAG},makeplane/plane-backend:stable
fi
elif [ "${{ env.TARGET_BRANCH }}" == "master" ]; then
TAG=makeplane/plane-backend:latest
else
TAG=${{ env.BACKEND_TAG }}
fi
echo "BACKEND_TAG=${TAG}" >> $GITHUB_ENV
- name: Login to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
with:
driver: ${{ env.BUILDX_DRIVER }}
version: ${{ env.BUILDX_VERSION }}
endpoint: ${{ env.BUILDX_ENDPOINT }}
- name: Check out the repo
uses: actions/checkout@v4
- name: Build and Push Backend to Docker Hub
uses: docker/build-push-action@v5.1.0
with:
context: ./apiserver
file: ./apiserver/Dockerfile.api
platforms: ${{ env.BUILDX_PLATFORMS }}
push: true
tags: ${{ env.BACKEND_TAG }}
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
build-release: ${{ needs.branch_build_setup.outputs.build_release }}
build-prerelease: ${{ needs.branch_build_setup.outputs.build_prerelease }}
release-version: ${{ needs.branch_build_setup.outputs.release_version }}
docker-username: ${{ secrets.DOCKERHUB_USERNAME }}
docker-token: ${{ secrets.DOCKERHUB_TOKEN }}
docker-image-owner: makeplane
docker-image-name: ${{ needs.branch_build_setup.outputs.dh_img_space }}
build-context: .
dockerfile-path: ./space/Dockerfile.space
buildx-driver: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
buildx-version: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
buildx-platforms: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
buildx-endpoint: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
branch_build_push_live:
if: ${{ needs.branch_build_setup.outputs.build_live == 'true' || github.event_name == 'workflow_dispatch' || github.event_name == 'release' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
if: ${{ needs.branch_build_setup.outputs.build_live == 'true' || github.event_name == 'workflow_dispatch' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
name: Build-Push Live Collaboration Docker Image
runs-on: ubuntu-20.04
needs: [branch_build_setup]
env:
LIVE_TAG: makeplane/plane-live:${{ needs.branch_build_setup.outputs.flat_branch_name }}
TARGET_BRANCH: ${{ needs.branch_build_setup.outputs.gh_branch_name }}
BUILDX_DRIVER: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
BUILDX_VERSION: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
BUILDX_PLATFORMS: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
BUILDX_ENDPOINT: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
steps:
- name: Set Live Docker Tag
run: |
if [ "${{ github.event_name }}" == "release" ]; then
TAG=makeplane/plane-live:${{ github.event.release.tag_name }}
if [ "${{ github.event.release.prerelease }}" != "true" ]; then
TAG=${TAG},makeplane/plane-live:stable
fi
elif [ "${{ env.TARGET_BRANCH }}" == "master" ]; then
TAG=makeplane/plane-live:latest
else
TAG=${{ env.LIVE_TAG }}
fi
echo "LIVE_TAG=${TAG}" >> $GITHUB_ENV
- name: Login to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
with:
driver: ${{ env.BUILDX_DRIVER }}
version: ${{ env.BUILDX_VERSION }}
endpoint: ${{ env.BUILDX_ENDPOINT }}
- name: Check out the repo
- id: checkout_files
name: Checkout Files
uses: actions/checkout@v4
- name: Build and Push Live Server to Docker Hub
uses: docker/build-push-action@v5.1.0
- name: Live Build and Push
uses: ./.github/actions/build-push-ce
with:
context: .
file: ./live/Dockerfile.live
platforms: ${{ env.BUILDX_PLATFORMS }}
tags: ${{ env.LIVE_TAG }}
push: true
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
build-release: ${{ needs.branch_build_setup.outputs.build_release }}
build-prerelease: ${{ needs.branch_build_setup.outputs.build_prerelease }}
release-version: ${{ needs.branch_build_setup.outputs.release_version }}
docker-username: ${{ secrets.DOCKERHUB_USERNAME }}
docker-token: ${{ secrets.DOCKERHUB_TOKEN }}
docker-image-owner: makeplane
docker-image-name: ${{ needs.branch_build_setup.outputs.dh_img_live }}
build-context: .
dockerfile-path: ./live/Dockerfile.live
buildx-driver: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
buildx-version: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
buildx-platforms: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
buildx-endpoint: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
branch_build_push_apiserver:
if: ${{ needs.branch_build_setup.outputs.build_apiserver == 'true' || github.event_name == 'workflow_dispatch' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
name: Build-Push API Server Docker Image
runs-on: ubuntu-20.04
needs: [branch_build_setup]
steps:
- id: checkout_files
name: Checkout Files
uses: actions/checkout@v4
- name: Backend Build and Push
uses: ./.github/actions/build-push-ce
with:
build-release: ${{ needs.branch_build_setup.outputs.build_release }}
build-prerelease: ${{ needs.branch_build_setup.outputs.build_prerelease }}
release-version: ${{ needs.branch_build_setup.outputs.release_version }}
docker-username: ${{ secrets.DOCKERHUB_USERNAME }}
docker-token: ${{ secrets.DOCKERHUB_TOKEN }}
docker-image-owner: makeplane
docker-image-name: ${{ needs.branch_build_setup.outputs.dh_img_backend }}
build-context: ./apiserver
dockerfile-path: ./apiserver/Dockerfile.api
buildx-driver: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
buildx-version: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
buildx-platforms: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
buildx-endpoint: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
branch_build_push_proxy:
if: ${{ needs.branch_build_setup.outputs.build_proxy == 'true' || github.event_name == 'workflow_dispatch' || github.event_name == 'release' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
if: ${{ needs.branch_build_setup.outputs.build_proxy == 'true' || github.event_name == 'workflow_dispatch' || needs.branch_build_setup.outputs.gh_branch_name == 'master' }}
name: Build-Push Proxy Docker Image
runs-on: ubuntu-20.04
needs: [branch_build_setup]
env:
PROXY_TAG: makeplane/plane-proxy:${{ needs.branch_build_setup.outputs.flat_branch_name }}
TARGET_BRANCH: ${{ needs.branch_build_setup.outputs.gh_branch_name }}
BUILDX_DRIVER: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
BUILDX_VERSION: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
BUILDX_PLATFORMS: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
BUILDX_ENDPOINT: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
steps:
- name: Set Proxy Docker Tag
run: |
if [ "${{ github.event_name }}" == "release" ]; then
TAG=makeplane/plane-proxy:${{ github.event.release.tag_name }}
if [ "${{ env.IS_PRERELEASE }}" != "true" ]; then
TAG=${TAG},makeplane/plane-proxy:stable
fi
elif [ "${{ env.TARGET_BRANCH }}" == "master" ]; then
TAG=makeplane/plane-proxy:latest
else
TAG=${{ env.PROXY_TAG }}
fi
echo "PROXY_TAG=${TAG}" >> $GITHUB_ENV
- name: Login to Docker Hub
uses: docker/login-action@v3
- id: checkout_files
name: Checkout Files
uses: actions/checkout@v4
- name: Proxy Build and Push
uses: ./.github/actions/build-push-ce
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
build-release: ${{ needs.branch_build_setup.outputs.build_release }}
build-prerelease: ${{ needs.branch_build_setup.outputs.build_prerelease }}
release-version: ${{ needs.branch_build_setup.outputs.release_version }}
docker-username: ${{ secrets.DOCKERHUB_USERNAME }}
docker-token: ${{ secrets.DOCKERHUB_TOKEN }}
docker-image-owner: makeplane
docker-image-name: ${{ needs.branch_build_setup.outputs.dh_img_proxy }}
build-context: ./nginx
dockerfile-path: ./nginx/Dockerfile
buildx-driver: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
buildx-version: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
buildx-platforms: ${{ needs.branch_build_setup.outputs.gh_buildx_platforms }}
buildx-endpoint: ${{ needs.branch_build_setup.outputs.gh_buildx_endpoint }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
with:
driver: ${{ env.BUILDX_DRIVER }}
version: ${{ env.BUILDX_VERSION }}
endpoint: ${{ env.BUILDX_ENDPOINT }}
- name: Check out the repo
attach_assets_to_build:
if: ${{ needs.branch_build_setup.outputs.build_type == 'Release' }}
name: Attach Assets to Release
runs-on: ubuntu-20.04
needs: [branch_build_setup]
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Build and Push Plane-Proxy to Docker Hub
uses: docker/build-push-action@v5.1.0
- name: Update Assets
run: |
cp ./deploy/selfhost/install.sh deploy/selfhost/setup.sh
- name: Attach Assets
id: attach_assets
uses: actions/upload-artifact@v4
with:
context: ./nginx
file: ./nginx/Dockerfile
platforms: ${{ env.BUILDX_PLATFORMS }}
tags: ${{ env.PROXY_TAG }}
push: true
name: selfhost-assets
retention-days: 2
path: |
${{ github.workspace }}/deploy/selfhost/setup.sh
${{ github.workspace }}/deploy/selfhost/restore.sh
${{ github.workspace }}/deploy/selfhost/docker-compose.yml
${{ github.workspace }}/deploy/selfhost/variables.env
publish_release:
if: ${{ needs.branch_build_setup.outputs.build_type == 'Release' }}
name: Build Release
runs-on: ubuntu-20.04
needs:
[
branch_build_setup,
branch_build_push_admin,
branch_build_push_web,
branch_build_push_space,
branch_build_push_live,
branch_build_push_apiserver,
branch_build_push_proxy,
attach_assets_to_build,
]
env:
REL_VERSION: ${{ needs.branch_build_setup.outputs.release_version }}
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Update Assets
run: |
cp ./deploy/selfhost/install.sh deploy/selfhost/setup.sh
- name: Create Release
id: create_release
uses: softprops/action-gh-release@v2.1.0
env:
DOCKER_BUILDKIT: 1
DOCKER_USERNAME: ${{ secrets.DOCKERHUB_USERNAME }}
DOCKER_PASSWORD: ${{ secrets.DOCKERHUB_TOKEN }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # This token is provided by Actions, you do not need to create your own token
with:
tag_name: ${{ env.REL_VERSION }}
name: ${{ env.REL_VERSION }}
draft: false
prerelease: ${{ env.IS_PRERELEASE }}
generate_release_notes: true
files: |
${{ github.workspace }}/deploy/selfhost/setup.sh
${{ github.workspace }}/deploy/selfhost/restore.sh
${{ github.workspace }}/deploy/selfhost/docker-compose.yml
${{ github.workspace }}/deploy/selfhost/variables.env

View File

@@ -29,11 +29,11 @@ jobs:
steps:
- name: Checkout repository
uses: actions/checkout@v3
uses: actions/checkout@v4
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v2
uses: github/codeql-action/init@v3
with:
languages: ${{ matrix.language }}
# If you wish to specify custom queries, you can do so here or in a config file.
@@ -46,7 +46,7 @@ jobs:
# Autobuild attempts to build any compiled languages (C/C++, C#, Go, Java, or Swift).
# If this step fails, then you should remove it and run the build manually (see below)
- name: Autobuild
uses: github/codeql-action/autobuild@v2
uses: github/codeql-action/autobuild@v3
# Command-line programs to run using the OS shell.
# 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun
@@ -59,6 +59,6 @@ jobs:
# ./location_of_script_within_repo/buildscript.sh
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v2
uses: github/codeql-action/analyze@v3
with:
category: "/language:${{matrix.language}}"

View File

@@ -79,7 +79,7 @@ jobs:
uses: actions/checkout@v4
- name: Build and Push to Docker Hub
uses: docker/build-push-action@v5.1.0
uses: docker/build-push-action@v6.9.0
with:
context: .
file: ./aio/Dockerfile-app

View File

@@ -8,21 +8,20 @@ on:
env:
CURRENT_BRANCH: ${{ github.ref_name }}
TARGET_BRANCH: ${{ vars.SYNC_TARGET_BRANCH_NAME }} # The target branch that you would like to merge changes like develop
TARGET_BRANCH: "preview" # The target branch that you would like to merge changes like develop
GITHUB_TOKEN: ${{ secrets.ACCESS_TOKEN }} # Personal access token required to modify contents and workflows
REVIEWER: ${{ vars.SYNC_PR_REVIEWER }}
ACCOUNT_USER_NAME: ${{ vars.ACCOUNT_USER_NAME }}
ACCOUNT_USER_EMAIL: ${{ vars.ACCOUNT_USER_EMAIL }}
jobs:
Create_PR:
create_pull_request:
runs-on: ubuntu-latest
permissions:
pull-requests: write
contents: write
steps:
- name: Checkout code
uses: actions/checkout@v4.1.1
uses: actions/checkout@v4
with:
fetch-depth: 0 # Fetch all history for all branches and tags
@@ -48,6 +47,6 @@ jobs:
echo "Pull Request already exists: $PR_EXISTS"
else
echo "Creating new pull request"
PR_URL=$(gh pr create --base $TARGET_BRANCH --head $CURRENT_BRANCH --title "sync: community changes" --body "")
PR_URL=$(gh pr create --base $TARGET_BRANCH --head $CURRENT_BRANCH --title "${{ vars.SYNC_PR_TITLE }}" --body "")
echo "Pull Request created: $PR_URL"
fi

View File

@@ -17,7 +17,7 @@ jobs:
contents: read
steps:
- name: Checkout Code
uses: actions/checkout@v4.1.1
uses: actions/checkout@v4
with:
persist-credentials: false
fetch-depth: 0
@@ -35,9 +35,8 @@ jobs:
env:
GH_TOKEN: ${{ secrets.ACCESS_TOKEN }}
run: |
RUN_ID="${{ github.run_id }}"
TARGET_REPO="${{ vars.SYNC_TARGET_REPO }}"
TARGET_BRANCH="sync/${RUN_ID}"
TARGET_BRANCH="${{ vars.SYNC_TARGET_BRANCH_NAME }}"
SOURCE_BRANCH="${{ env.SOURCE_BRANCH_NAME }}"
git checkout $SOURCE_BRANCH

README.md

@@ -5,9 +5,7 @@
<img src="https://plane-marketing.s3.ap-south-1.amazonaws.com/plane-readme/plane_logo_.webp" alt="Plane Logo" width="70">
</a>
</p>
<h3 align="center"><b>Plane</b></h3>
<p align="center"><b>Open-source project management that unlocks customer value</b></p>
<h1 align="center"><b>Plane</b></h1>
<p align="center">
<a href="https://discord.com/invite/A92xrEGCge">
@@ -44,79 +42,85 @@ Meet [Plane](https://dub.sh/plane-website-readme), an open-source project manage
> Plane is evolving every day. Your suggestions, ideas, and reported bugs help us immensely. Do not hesitate to join in the conversation on [Discord](https://discord.com/invite/A92xrEGCge) or raise a GitHub issue. We read everything and respond to most.
## Installation
## 🚀 Installation
The easiest way to get started with Plane is by creating a [Plane Cloud](https://app.plane.so) account.
Getting started with Plane is simple. Choose the setup that works best for you:
If you would like to self-host Plane, please see our [deployment guide](https://docs.plane.so/docker-compose).
- **Plane Cloud**
Sign up for a free account on [Plane Cloud](https://app.plane.so)—it's the fastest way to get up and running without worrying about infrastructure.
- **Self-host Plane**
Prefer full control over your data and infrastructure? Install and run Plane on your own servers. Follow our detailed [deployment guides](https://developers.plane.so/self-hosting/overview) to get started.
| Installation methods | Docs link |
| -------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| Docker | [![Docker](https://img.shields.io/badge/docker-%230db7ed.svg?style=for-the-badge&logo=docker&logoColor=white)](https://docs.plane.so/self-hosting/methods/docker-compose) |
| Kubernetes | [![Kubernetes](https://img.shields.io/badge/kubernetes-%23326ce5.svg?style=for-the-badge&logo=kubernetes&logoColor=white)](https://docs.plane.so/kubernetes) |
| Docker | [![Docker](https://img.shields.io/badge/docker-%230db7ed.svg?style=for-the-badge&logo=docker&logoColor=white)](https://developers.plane.so/self-hosting/methods/docker-compose) |
| Kubernetes | [![Kubernetes](https://img.shields.io/badge/kubernetes-%23326ce5.svg?style=for-the-badge&logo=kubernetes&logoColor=white)](https://developers.plane.so/self-hosting/methods/kubernetes) |
`Instance admins` can configure instance settings with [God-mode](https://docs.plane.so/instance-admin).
`Instance admins` can manage and customize settings using [God mode](https://developers.plane.so/self-hosting/govern/instance-admin).
## 🚀 Features
## 🌟 Features
- **Issues**: Quickly create issues and add details using a powerful rich text editor that supports file uploads. Add sub-properties and references to problems for better organization and tracking.
- **Issues**
Efficiently create and manage tasks with a robust rich text editor that supports file uploads. Enhance organization and tracking by adding sub-properties and referencing related issues.
- **Cycles**:
Keep up your team's momentum with Cycles. Gain insights into your project's progress with burn-down charts and other valuable features.
- **Cycles**
Maintain your team's momentum with Cycles. Track progress effortlessly using burn-down charts and other insightful tools.
- **Modules**: Break down your large projects into smaller, more manageable modules. Assign modules between teams to track and plan your project's progress easily.
- **Modules**
Simplify complex projects by dividing them into smaller, manageable modules.
- **Views**: Create custom filters to display only the issues that matter to you. Save and share your filters in just a few clicks.
- **Views**
Customize your workflow by creating filters to display only the most relevant issues. Save and share these views with ease.
- **Pages**: Plane pages, equipped with AI and a rich text editor, let you jot down your thoughts on the fly. Format your text, upload images, hyperlink, or sync your existing ideas into an actionable item or issue.
- **Pages**
Capture and organize ideas using Plane Pages, complete with AI capabilities and a rich text editor. Format text, insert images, add hyperlinks, or convert your notes into actionable items.
- **Analytics**: Get insights into all your Plane data in real-time. Visualize issue data to spot trends, remove blockers, and progress your work.
- **Analytics**
Access real-time insights across all your Plane data. Visualize trends, remove blockers, and keep your projects moving forward.
- **Drive** (_coming soon_): The drive helps you share documents, images, videos, or any other files that make sense to you or your team and align on the problem/solution.
## 🛠️ Quick start for contributors
> Development system must have docker engine installed and running.
## 🛠️ Local development
Setting up local environment is extremely easy and straight forward. Follow the below step and you will be ready to contribute -
### Pre-requisites
- Ensure Docker Engine is installed and running.
1. Clone the code locally using:
### Development setup
Setting up your local environment is simple and straightforward. Follow these steps to get started:
1. Clone the repository:
```
git clone https://github.com/makeplane/plane.git
```
2. Switch to the code folder:
2. Navigate to the project folder:
```
cd plane
```
3. Create your feature or fix branch you plan to work on using:
3. Create a new branch for your feature or fix:
```
git checkout -b <feature-branch-name>
```
4. Open terminal and run:
4. Run the setup script in the terminal:
```
./setup.sh
```
5. Open the code on VSCode or similar equivalent IDE.
6. Review the `.env` files available in various folders.
Visit [Environment Setup](./ENV_SETUP.md) to know about various environment variables used in system.
7. Run the docker command to initiate services:
5. Open the project in an IDE such as VS Code.
6. Review the `.env` files in the relevant folders. Refer to [Environment Setup](./ENV_SETUP.md) for details on the environment variables used.
7. Start the services using Docker:
```
docker compose -f docker-compose-local.yml up -d
```
You are ready to make changes to the code. Do not forget to refresh the browser (in case it does not auto-reload).
That's it! You're all set to begin coding. Remember to refresh your browser if changes don't auto-reload. Happy contributing! 🎉
That's it!
## ❤️ Community
The Plane community can be found on [GitHub Discussions](https://github.com/orgs/makeplane/discussions), and our [Discord server](https://discord.com/invite/A92xrEGCge). Our [Code of conduct](https://github.com/makeplane/plane/blob/master/CODE_OF_CONDUCT.md) applies to all Plane community channels.
Ask questions, report bugs, join discussions, voice ideas, make feature requests, or share your projects.
### Repo Activity
![Plane Repo Activity](https://repobeats.axiom.co/api/embed/2523c6ed2f77c082b7908c33e2ab208981d76c39.svg "Repobeats analytics image")
## Built with
[![Next JS](https://img.shields.io/badge/next.js-000000?style=for-the-badge&logo=nextdotjs&logoColor=white)](https://nextjs.org/)<br/>
[![Django](https://img.shields.io/badge/Django-092E20?style=for-the-badge&logo=django&logoColor=green)](https://www.djangoproject.com/)<br/>
[![Node JS](https://img.shields.io/badge/node.js-339933?style=for-the-badge&logo=Node.js&logoColor=white)](https://nodejs.org/en)
## 📸 Screenshots
@@ -165,7 +169,7 @@ Ask questions, report bugs, join discussions, voice ideas, make feature requests
</a>
</p>
</p>
<p>
<p>
<a href="https://plane.so" target="_blank">
<img
src="https://ik.imagekit.io/w2okwbtu2/Drive_LlfeY4xn3.png?updatedAt=1709298837917"
@@ -176,23 +180,42 @@ Ask questions, report bugs, join discussions, voice ideas, make feature requests
</p>
</p>
## ⛓️ Security
## 📝 Documentation
Explore Plane's [product documentation](https://docs.plane.so/) and [developer documentation](https://developers.plane.so/) to learn about features, setup, and usage.
If you believe you have found a security vulnerability in Plane, we encourage you to responsibly disclose this and not open a public issue. We will investigate all legitimate reports.
## ❤️ Community
Email squawk@plane.so to disclose any security vulnerabilities.
Join the Plane community on [GitHub Discussions](https://github.com/orgs/makeplane/discussions) and our [Discord server](https://discord.com/invite/A92xrEGCge). We follow a [Code of conduct](https://github.com/makeplane/plane/blob/master/CODE_OF_CONDUCT.md) in all our community channels.
## ❤️ Contribute
Feel free to ask questions, report bugs, participate in discussions, share ideas, request features, or showcase your projects. We'd love to hear from you!
There are many ways to contribute to Plane, including:
## 🛡️ Security
- Submitting [bugs](https://github.com/makeplane/plane/issues/new?assignees=srinivaspendem%2Cpushya22&labels=%F0%9F%90%9Bbug&projects=&template=--bug-report.yaml&title=%5Bbug%5D%3A+) and [feature requests](https://github.com/makeplane/plane/issues/new?assignees=srinivaspendem%2Cpushya22&labels=%E2%9C%A8feature&projects=&template=--feature-request.yaml&title=%5Bfeature%5D%3A+) for various components.
- Reviewing [the documentation](https://docs.plane.so/) and submitting [pull requests](https://github.com/makeplane/plane), from fixing typos to adding new features.
- Speaking or writing about Plane or any other ecosystem integration and [letting us know](https://discord.com/invite/A92xrEGCge)!
- Upvoting [popular feature requests](https://github.com/makeplane/plane/issues) to show your support.
If you discover a security vulnerability in Plane, please report it responsibly instead of opening a public issue. We take all legitimate reports seriously and will investigate them promptly. See [Security policy](https://github.com/makeplane/plane/blob/master/SECURITY.md) for more info.
To disclose any security issues, please email us at security@plane.so.
## 🤝 Contributing
There are many ways you can contribute to Plane:
- Report [bugs](https://github.com/makeplane/plane/issues/new?assignees=srinivaspendem%2Cpushya22&labels=%F0%9F%90%9Bbug&projects=&template=--bug-report.yaml&title=%5Bbug%5D%3A+) or submit [feature requests](https://github.com/makeplane/plane/issues/new?assignees=srinivaspendem%2Cpushya22&labels=%E2%9C%A8feature&projects=&template=--feature-request.yaml&title=%5Bfeature%5D%3A+).
- Review the [documentation](https://docs.plane.so/) and submit [pull requests](https://github.com/makeplane/docs) to improve it—whether it's fixing typos or adding new content.
- Talk or write about Plane or any other ecosystem integration and [let us know](https://discord.com/invite/A92xrEGCge)!
- Show your support by upvoting [popular feature requests](https://github.com/makeplane/plane/issues).
Please read [CONTRIBUTING.md](https://github.com/makeplane/plane/blob/master/CONTRIBUTING.md) for details on the process for submitting pull requests to us.
### Repo activity
![Plane Repo Activity](https://repobeats.axiom.co/api/embed/2523c6ed2f77c082b7908c33e2ab208981d76c39.svg "Repobeats analytics image")
### We couldn't have done this without you.
<a href="https://github.com/makeplane/plane/graphs/contributors">
<img src="https://contrib.rocks/image?repo=makeplane/plane" />
</a>
## License
This project is licensed under the [GNU Affero General Public License v3.0](https://github.com/makeplane/plane/blob/master/LICENSE.txt).

View File

@@ -121,7 +121,12 @@ export const InstanceAIForm: FC<IInstanceAIForm> = (props) => {
<div className="relative inline-flex items-center gap-2 rounded border border-custom-primary-100/20 bg-custom-primary-100/10 px-4 py-2 text-xs text-custom-primary-200">
<Lightbulb height="14" width="14" />
<div>If you have a preferred AI models vendor, please get in touch with us.</div>
<div>
If you have a preferred AI models vendor, please get in{" "}
<a className="underline font-medium" href="https://plane.so/contact">
touch with us.
</a>
</div>
</div>
</div>
</div>

View File

@@ -195,7 +195,7 @@ export const InstanceGithubConfigForm: FC<Props> = (props) => {
</Button>
<Link
href="/authentication"
className={cn(getButtonStyling("link-neutral", "md"), "font-medium")}
className={cn(getButtonStyling("neutral-primary", "md"), "font-medium")}
onClick={handleGoBack}
>
Go back

View File

@@ -44,7 +44,7 @@ const InstanceGithubAuthenticationPage = observer(() => {
loading: "Saving Configuration...",
success: {
title: "Configuration saved",
message: () => `Github authentication is now ${value ? "active" : "disabled"}.`,
message: () => `GitHub authentication is now ${value ? "active" : "disabled"}.`,
},
error: {
title: "Error",
@@ -67,8 +67,8 @@ const InstanceGithubAuthenticationPage = observer(() => {
<div className="relative container mx-auto w-full h-full p-4 py-4 space-y-6 flex flex-col">
<div className="border-b border-custom-border-100 mx-4 py-4 space-y-1 flex-shrink-0">
<AuthenticationMethodCard
name="Github"
description="Allow members to login or sign up to plane with their Github accounts."
name="GitHub"
description="Allow members to login or sign up to plane with their GitHub accounts."
icon={
<Image
src={resolveGeneralTheme(resolvedTheme) === "dark" ? githubDarkModeImage : githubLightModeImage}

View File

@@ -191,7 +191,7 @@ export const InstanceGitlabConfigForm: FC<Props> = (props) => {
</Button>
<Link
href="/authentication"
className={cn(getButtonStyling("link-neutral", "md"), "font-medium")}
className={cn(getButtonStyling("neutral-primary", "md"), "font-medium")}
onClick={handleGoBack}
>
Go back

View File

@@ -192,7 +192,7 @@ export const InstanceGoogleConfigForm: FC<Props> = (props) => {
</Button>
<Link
href="/authentication"
className={cn(getButtonStyling("link-neutral", "md"), "font-medium")}
className={cn(getButtonStyling("neutral-primary", "md"), "font-medium")}
onClick={handleGoBack}
>
Go back

View File

@@ -60,7 +60,7 @@ const InstanceAuthenticationPage = observer(() => {
<div className="border-b border-custom-border-100 mx-4 py-4 space-y-1 flex-shrink-0">
<div className="text-xl font-medium text-custom-text-100">Manage authentication modes for your instance</div>
<div className="text-sm font-normal text-custom-text-300">
Configure authentication modes for your team and restrict sign ups to be invite only.
Configure authentication modes for your team and restrict sign-ups to be invite only.
</div>
</div>
<div className="flex-grow overflow-hidden overflow-y-scroll vertical-scrollbar scrollbar-md px-4">
@@ -80,9 +80,11 @@ const InstanceAuthenticationPage = observer(() => {
<ToggleSwitch
value={Boolean(parseInt(enableSignUpConfig))}
onChange={() => {
Boolean(parseInt(enableSignUpConfig)) === true
? updateConfig("ENABLE_SIGNUP", "0")
: updateConfig("ENABLE_SIGNUP", "1");
if (Boolean(parseInt(enableSignUpConfig)) === true) {
updateConfig("ENABLE_SIGNUP", "0");
} else {
updateConfig("ENABLE_SIGNUP", "1");
}
}}
size="sm"
disabled={isSubmitting}
@@ -90,7 +92,7 @@ const InstanceAuthenticationPage = observer(() => {
</div>
</div>
</div>
<div className="text-lg font-medium pt-6">Authentication modes</div>
<div className="text-lg font-medium pt-6">Available authentication modes</div>
<AuthenticationModes disabled={isSubmitting} updateConfig={updateConfig} />
</div>
) : (

View File

@@ -72,7 +72,7 @@ export const InstanceEmailForm: FC<IInstanceEmailForm> = (props) => {
{
key: "EMAIL_FROM",
type: "text",
label: "Sender email address",
label: "Sender's email address",
description:
"This is the email address your users will see when getting emails from this instance. You will need to verify this address.",
placeholder: "no-reply@projectplane.so",
@@ -174,12 +174,12 @@ export const InstanceEmailForm: FC<IInstanceEmailForm> = (props) => {
</div>
</div>
<div className="flex flex-col gap-6 my-6 pt-4 border-t border-custom-border-100">
<div className="flex w-full max-w-md flex-col gap-y-10 px-1">
<div className="flex w-full max-w-xl flex-col gap-y-10 px-1">
<div className="mr-8 flex items-center gap-10 pt-4">
<div className="grow">
<div className="text-sm font-medium text-custom-text-100">Authentication (optional)</div>
<div className="text-sm font-medium text-custom-text-100">Authentication</div>
<div className="text-xs font-normal text-custom-text-300">
We recommend setting up a username password for your SMTP server
This is optional, but we recommend setting up a username and a password for your SMTP server.
</div>
</div>
</div>

View File

@@ -117,17 +117,18 @@ export const GeneralConfigurationForm: FC<IGeneralConfigurationForm> = observer(
</div>
<div className="grow">
<div className="text-sm font-medium text-custom-text-100 leading-5">
Allow Plane to collect anonymous usage events
Let Plane collect anonymous usage data
</div>
<div className="text-xs font-normal text-custom-text-300 leading-5">
We collect usage events without any PII to analyse and improve Plane.{" "}
No PII is collected. This anonymized data is used to understand how you use Plane and build new features
in line with{" "}
<a
href="https://docs.plane.so/self-hosting/telemetry"
target="_blank"
className="text-custom-primary-100 hover:underline"
rel="noreferrer"
>
Know more.
our Telemetry Policy.
</a>
</div>
</div>

View File

@@ -60,9 +60,9 @@ export const IntercomConfig: FC<TIntercomConfig> = observer((props) => {
</div>
<div className="grow">
<div className="text-sm font-medium text-custom-text-100 leading-5">Talk to Plane</div>
<div className="text-sm font-medium text-custom-text-100 leading-5">Chat with us</div>
<div className="text-xs font-normal text-custom-text-300 leading-5">
Let your members chat with us via Intercom or another service. Toggling Telemetry off turns this off
Let your users chat with us via Intercom or another service. Toggling Telemetry off turns this off
automatically.
</div>
</div>

View File

@@ -0,0 +1,214 @@
import { useState, useEffect } from "react";
import Link from "next/link";
import { useRouter } from "next/navigation";
import { Controller, useForm } from "react-hook-form";
// constants
import { ORGANIZATION_SIZE, RESTRICTED_URLS } from "@plane/constants";
// types
import { IWorkspace } from "@plane/types";
// components
import { Button, CustomSelect, getButtonStyling, Input, setToast, TOAST_TYPE } from "@plane/ui";
// helpers
import { WEB_BASE_URL } from "@/helpers/common.helper";
// hooks
import { useWorkspace } from "@/hooks/store";
// services
import { WorkspaceService } from "@/services/workspace.service";
const workspaceService = new WorkspaceService();
export const WorkspaceCreateForm = () => {
// router
const router = useRouter();
// states
const [slugError, setSlugError] = useState(false);
const [invalidSlug, setInvalidSlug] = useState(false);
const [defaultValues, setDefaultValues] = useState<Partial<IWorkspace>>({
name: "",
slug: "",
organization_size: "",
});
// store hooks
const { createWorkspace } = useWorkspace();
// form info
const {
handleSubmit,
control,
setValue,
getValues,
formState: { errors, isSubmitting, isValid },
} = useForm<IWorkspace>({ defaultValues, mode: "onChange" });
// derived values
const workspaceBaseURL = encodeURI(WEB_BASE_URL || window.location.origin + "/");
const handleCreateWorkspace = async (formData: IWorkspace) => {
await workspaceService
.workspaceSlugCheck(formData.slug)
.then(async (res) => {
if (res.status === true && !RESTRICTED_URLS.includes(formData.slug)) {
setSlugError(false);
await createWorkspace(formData)
.then(async () => {
setToast({
type: TOAST_TYPE.SUCCESS,
title: "Success!",
message: "Workspace created successfully.",
});
router.push(`/workspace`);
})
.catch(() => {
setToast({
type: TOAST_TYPE.ERROR,
title: "Error!",
message: "Workspace could not be created. Please try again.",
});
});
} else setSlugError(true);
})
.catch(() => {
setToast({
type: TOAST_TYPE.ERROR,
title: "Error!",
message: "Some error occurred while creating workspace. Please try again.",
});
});
};
useEffect(
() => () => {
// when the component unmounts set the default values to whatever user typed in
setDefaultValues(getValues());
},
[getValues, setDefaultValues]
);
return (
<div className="space-y-8">
<div className="grid-col grid w-full max-w-4xl grid-cols-1 items-start justify-between gap-x-10 gap-y-6 lg:grid-cols-2">
<div className="flex flex-col gap-1">
<h4 className="text-sm text-custom-text-300">Name your workspace</h4>
<div className="flex flex-col gap-1">
<Controller
control={control}
name="name"
rules={{
required: "This is a required field.",
validate: (value) =>
/^[\w\s-]*$/.test(value) ||
`Workspace names can contain only (" "), ( - ), ( _ ) and alphanumeric characters.`,
maxLength: {
value: 80,
message: "Limit your name to 80 characters.",
},
}}
render={({ field: { value, ref, onChange } }) => (
<Input
id="workspaceName"
type="text"
value={value}
onChange={(e) => {
onChange(e.target.value);
setValue("name", e.target.value);
setValue("slug", e.target.value.toLocaleLowerCase().trim().replace(/ /g, "-"), {
shouldValidate: true,
});
}}
ref={ref}
hasError={Boolean(errors.name)}
placeholder="Something familiar and recognizable is always best."
className="w-full"
/>
)}
/>
<span className="text-xs text-red-500">{errors?.name?.message}</span>
</div>
</div>
<div className="flex flex-col gap-1">
<h4 className="text-sm text-custom-text-300">Set your workspace&apos;s URL</h4>
<div className="flex gap-0.5 w-full items-center rounded-md border-[0.5px] border-custom-border-200 px-3">
<span className="whitespace-nowrap text-sm text-custom-text-200">{workspaceBaseURL}</span>
<Controller
control={control}
name="slug"
rules={{
required: "The URL is a required field.",
maxLength: {
value: 48,
message: "Limit your URL to 48 characters.",
},
}}
render={({ field: { onChange, value, ref } }) => (
<Input
id="workspaceUrl"
type="text"
value={value.toLocaleLowerCase().trim().replace(/ /g, "-")}
onChange={(e) => {
if (/^[a-zA-Z0-9_-]+$/.test(e.target.value)) setInvalidSlug(false);
else setInvalidSlug(true);
onChange(e.target.value.toLowerCase());
}}
ref={ref}
hasError={Boolean(errors.slug)}
placeholder="workspace-name"
className="block w-full rounded-md border-none bg-transparent !px-0 py-2 text-sm"
/>
)}
/>
</div>
{slugError && <p className="text-sm text-red-500">This URL is taken. Try something else.</p>}
{invalidSlug && (
<p className="text-sm text-red-500">{`URLs can contain only ( - ), ( _ ) and alphanumeric characters.`}</p>
)}
{errors.slug && <span className="text-xs text-red-500">{errors.slug.message}</span>}
</div>
<div className="flex flex-col gap-1">
<h4 className="text-sm text-custom-text-300">How many people will use this workspace?</h4>
<div className="w-full">
<Controller
name="organization_size"
control={control}
rules={{ required: "This is a required field." }}
render={({ field: { value, onChange } }) => (
<CustomSelect
value={value}
onChange={onChange}
label={
ORGANIZATION_SIZE.find((c) => c === value) ?? (
<span className="text-custom-text-400">Select a range</span>
)
}
buttonClassName="!border-[0.5px] !border-custom-border-200 !shadow-none"
input
optionsClassName="w-full"
>
{ORGANIZATION_SIZE.map((item) => (
<CustomSelect.Option key={item} value={item}>
{item}
</CustomSelect.Option>
))}
</CustomSelect>
)}
/>
{errors.organization_size && (
<span className="text-sm text-red-500">{errors.organization_size.message}</span>
)}
</div>
</div>
</div>
<div className="flex max-w-4xl items-center py-1 gap-4">
<Button
variant="primary"
size="sm"
onClick={handleSubmit(handleCreateWorkspace)}
disabled={!isValid}
loading={isSubmitting}
>
{isSubmitting ? "Creating workspace" : "Create workspace"}
</Button>
<Link className={getButtonStyling("neutral-primary", "sm")} href="/workspace">
Go back
</Link>
</div>
</div>
);
};

View File

@@ -0,0 +1,21 @@
"use client";
import { observer } from "mobx-react";
// components
import { WorkspaceCreateForm } from "./form";
const WorkspaceCreatePage = observer(() => (
<div className="relative container mx-auto w-full h-full p-4 py-4 space-y-6 flex flex-col">
<div className="border-b border-custom-border-100 mx-4 py-4 space-y-1 flex-shrink-0">
<div className="text-xl font-medium text-custom-text-100">Create a new workspace on this instance.</div>
<div className="text-sm font-normal text-custom-text-300">
You will need to invite users from Workspace Settings after you create this workspace.
</div>
</div>
<div className="flex-grow overflow-hidden overflow-y-scroll vertical-scrollbar scrollbar-md px-4">
<WorkspaceCreateForm />
</div>
</div>
));
export default WorkspaceCreatePage;

View File

@@ -0,0 +1,12 @@
import { ReactNode } from "react";
import { Metadata } from "next";
// layouts
import { AdminLayout } from "@/layouts/admin-layout";
export const metadata: Metadata = {
title: "Workspace Management - Plane Web",
};
export default function WorkspaceManagementLayout({ children }: { children: ReactNode }) {
return <AdminLayout>{children}</AdminLayout>;
}

View File

@@ -0,0 +1,169 @@
"use client";
import { useState } from "react";
import { observer } from "mobx-react";
import Link from "next/link";
import useSWR from "swr";
import { Loader as LoaderIcon } from "lucide-react";
// types
import { TInstanceConfigurationKeys } from "@plane/types";
// ui
import { Button, getButtonStyling, Loader, setPromiseToast, ToggleSwitch } from "@plane/ui";
// components
import { WorkspaceListItem } from "@/components/workspace";
// helpers
import { cn } from "@/helpers/common.helper";
// hooks
import { useInstance, useWorkspace } from "@/hooks/store";
const WorkspaceManagementPage = observer(() => {
// states
const [isSubmitting, setIsSubmitting] = useState<boolean>(false);
// store
const { formattedConfig, fetchInstanceConfigurations, updateInstanceConfigurations } = useInstance();
const {
workspaceIds,
loader: workspaceLoader,
paginationInfo,
fetchWorkspaces,
fetchNextWorkspaces,
} = useWorkspace();
// derived values
const disableWorkspaceCreation = formattedConfig?.DISABLE_WORKSPACE_CREATION ?? "";
const hasNextPage = paginationInfo?.next_page_results && paginationInfo?.next_cursor !== undefined;
// fetch data
useSWR("INSTANCE_CONFIGURATIONS", () => fetchInstanceConfigurations());
useSWR("INSTANCE_WORKSPACES", () => fetchWorkspaces());
const updateConfig = async (key: TInstanceConfigurationKeys, value: string) => {
setIsSubmitting(true);
const payload = {
[key]: value,
};
const updateConfigPromise = updateInstanceConfigurations(payload);
setPromiseToast(updateConfigPromise, {
loading: "Saving configuration",
success: {
title: "Success",
message: () => "Configuration saved successfully",
},
error: {
title: "Error",
message: () => "Failed to save configuration",
},
});
await updateConfigPromise
.then(() => {
setIsSubmitting(false);
})
.catch((err) => {
console.error(err);
setIsSubmitting(false);
});
};
return (
<div className="relative container mx-auto w-full h-full p-4 py-4 space-y-6 flex flex-col">
<div className="flex items-center justify-between gap-4 border-b border-custom-border-100 mx-4 py-4 space-y-1 flex-shrink-0">
<div className="flex flex-col gap-1">
<div className="text-xl font-medium text-custom-text-100">Workspaces on this instance</div>
<div className="text-sm font-normal text-custom-text-300">
See all workspaces and control who can create them.
</div>
</div>
</div>
<div className="flex-grow overflow-hidden overflow-y-scroll vertical-scrollbar scrollbar-md px-4">
<div className="space-y-3">
{formattedConfig ? (
<div className={cn("w-full flex items-center gap-14 rounded")}>
<div className="flex grow items-center gap-4">
<div className="grow">
<div className="text-lg font-medium pb-1">Prevent anyone else from creating a workspace.</div>
<div className={cn("font-normal leading-5 text-custom-text-300 text-xs")}>
Toggling this on will let only you create workspaces. You will have to invite users to new
workspaces.
</div>
</div>
</div>
<div className={`shrink-0 pr-4 ${isSubmitting && "opacity-70"}`}>
<div className="flex items-center gap-4">
<ToggleSwitch
value={Boolean(parseInt(disableWorkspaceCreation))}
onChange={() => {
if (Boolean(parseInt(disableWorkspaceCreation)) === true) {
updateConfig("DISABLE_WORKSPACE_CREATION", "0");
} else {
updateConfig("DISABLE_WORKSPACE_CREATION", "1");
}
}}
size="sm"
disabled={isSubmitting}
/>
</div>
</div>
</div>
) : (
<Loader>
<Loader.Item height="50px" width="100%" />
</Loader>
)}
{workspaceLoader !== "init-loader" ? (
<>
<div className="pt-6 flex items-center justify-between gap-2">
<div className="flex flex-col items-start gap-x-2">
<div className="flex items-center gap-2 text-lg font-medium">
All workspaces on this instance{" "}
<span className="text-custom-text-300"> {workspaceIds.length}</span>
{workspaceLoader && ["mutation", "pagination"].includes(workspaceLoader) && (
<LoaderIcon className="w-4 h-4 animate-spin" />
)}
</div>
<div className={cn("font-normal leading-5 text-custom-text-300 text-xs")}>
You can&apos;t yet delete workspaces and you can only go to the workspace if you are an Admin or a
Member.
</div>
</div>
<div className="flex items-center gap-2">
<Link href="/workspace/create" className={getButtonStyling("primary", "sm")}>
Create workspace
</Link>
</div>
</div>
<div className="flex flex-col gap-4 py-2">
{workspaceIds.map((workspaceId) => (
<WorkspaceListItem key={workspaceId} workspaceId={workspaceId} />
))}
</div>
{hasNextPage && (
<div className="flex justify-center">
<Button
variant="link-primary"
onClick={() => fetchNextWorkspaces()}
disabled={workspaceLoader === "pagination"}
>
Load more
{workspaceLoader === "pagination" && <LoaderIcon className="w-3 h-3 animate-spin" />}
</Button>
</div>
)}
</>
) : (
<Loader className="space-y-10 py-8">
<Loader.Item height="24px" width="20%" />
<Loader.Item height="92px" width="100%" />
<Loader.Item height="92px" width="100%" />
<Loader.Item height="92px" width="100%" />
</Loader>
)}
</div>
</div>
</div>
);
});
export default WorkspaceManagementPage;

View File

@@ -9,8 +9,8 @@ import { getButtonStyling } from "@plane/ui";
import { cn } from "@/helpers/common.helper";
export const UpgradeButton: React.FC = () => (
<a href="https://plane.so/one" target="_blank" className={cn(getButtonStyling("primary", "sm"))}>
Available on One
<a href="https://plane.so/pricing?mode=self-hosted" target="_blank" className={cn(getButtonStyling("primary", "sm"))}>
Upgrade
<SquareArrowOutUpRight className="h-3.5 w-3.5 p-0.5" />
</a>
);

View File

@@ -52,13 +52,13 @@ export const HelpSection: FC = observer(() => {
)}
>
<div className={`flex items-center gap-1 ${isSidebarCollapsed ? "flex-col justify-center" : "w-full"}`}>
<Tooltip tooltipContent="Redirect to plane" position="right" className="ml-4" disabled={!isSidebarCollapsed}>
<Tooltip tooltipContent="Redirect to Plane" position="right" className="ml-4" disabled={!isSidebarCollapsed}>
<a
href={redirectionLink}
className={`relative px-2 py-1.5 flex items-center gap-2 font-medium rounded border border-custom-primary-100/20 bg-custom-primary-100/10 text-xs text-custom-primary-200 whitespace-nowrap`}
>
<ExternalLink size={14} />
{!isSidebarCollapsed && "Redirect to plane"}
{!isSidebarCollapsed && "Redirect to Plane"}
</a>
</Tooltip>
<Tooltip tooltipContent="Help" position={isSidebarCollapsed ? "right" : "top"} className="ml-4">

View File

@@ -3,7 +3,7 @@
import { FC, useEffect, useRef } from "react";
import { observer } from "mobx-react";
// plane helpers
import { useOutsideClickDetector } from "@plane/helpers";
import { useOutsideClickDetector } from "@plane/hooks";
// components
import { HelpSection, SidebarMenu, SidebarDropdown } from "@/components/admin-sidebar";
// hooks

View File

@@ -5,11 +5,13 @@ import { observer } from "mobx-react";
import { useTheme as useNextTheme } from "next-themes";
import { LogOut, UserCog2, Palette } from "lucide-react";
import { Menu, Transition } from "@headlessui/react";
// plane ui
import { Avatar } from "@plane/ui";
// hooks
import { API_BASE_URL, cn } from "@/helpers/common.helper";
import { useTheme, useUser } from "@/hooks/store";
// helpers
import { API_BASE_URL, cn } from "@/helpers/common.helper";
import { getFileURL } from "@/helpers/file.helper";
// hooks
import { useTheme, useUser } from "@/hooks/store";
// services
import { AuthService } from "@/services/auth.service";
@@ -122,7 +124,7 @@ export const SidebarDropdown = observer(() => {
<Menu.Button className="grid place-items-center outline-none">
<Avatar
name={currentUser.display_name}
src={currentUser.avatar ?? undefined}
src={getFileURL(currentUser.avatar_url)}
size={24}
shape="square"
className="!text-base"

View File

@@ -4,7 +4,7 @@ import { observer } from "mobx-react";
import Link from "next/link";
import { usePathname } from "next/navigation";
import { Image, BrainCog, Cog, Lock, Mail } from "lucide-react";
import { Tooltip } from "@plane/ui";
import { Tooltip, WorkspaceIcon } from "@plane/ui";
// hooks
import { cn } from "@/helpers/common.helper";
import { useTheme } from "@/hooks/store";
@@ -14,31 +14,37 @@ const INSTANCE_ADMIN_LINKS = [
{
Icon: Cog,
name: "General",
description: "Identify your instances and get key details",
description: "Identify your instances and get key details.",
href: `/general/`,
},
{
Icon: WorkspaceIcon,
name: "Workspaces",
description: "Manage all workspaces on this instance.",
href: `/workspace/`,
},
{
Icon: Mail,
name: "Email",
description: "Set up emails to your users",
description: "Configure your SMTP controls.",
href: `/email/`,
},
{
Icon: Lock,
name: "Authentication",
description: "Configure authentication modes",
description: "Configure authentication modes.",
href: `/authentication/`,
},
{
Icon: BrainCog,
name: "Artificial intelligence",
description: "Configure your OpenAI creds",
description: "Configure your OpenAI creds.",
href: `/ai/`,
},
{
Icon: Image,
name: "Images in Plane",
description: "Allow third-party image libraries",
description: "Allow third-party image libraries.",
href: `/image/`,
},
];

View File

@@ -30,9 +30,13 @@ export const InstanceHeader: FC = observer(() => {
case "google":
return "Google";
case "github":
return "Github";
return "GitHub";
case "gitlab":
return "GitLab";
case "workspace":
return "Workspace";
case "create":
return "Create";
default:
return pathName.toUpperCase();
}

View File

@@ -1,13 +1,13 @@
"use client";
import React from "react";
import { resolveGeneralTheme } from "helpers/common.helper";
import { observer } from "mobx-react";
import Image from "next/image";
import Link from "next/link";
import { useTheme as nextUseTheme } from "next-themes";
// ui
import { Button, getButtonStyling } from "@plane/ui";
// helpers
import { WEB_BASE_URL, resolveGeneralTheme } from "helpers/common.helper";
// hooks
import { useTheme } from "@/hooks/store";
// icons
@@ -20,8 +20,6 @@ export const NewUserPopup: React.FC = observer(() => {
// theme
const { resolvedTheme } = nextUseTheme();
const redirectionLink = encodeURI(WEB_BASE_URL + "/create-workspace");
if (!isNewUserPopup) return <></>;
return (
<div className="absolute bottom-8 right-8 p-6 w-96 border border-custom-border-100 shadow-md rounded-lg bg-custom-background-100">
@@ -30,12 +28,12 @@ export const NewUserPopup: React.FC = observer(() => {
<div className="text-base font-semibold">Create workspace</div>
<div className="py-2 text-sm font-medium text-custom-text-300">
Instance setup done! Welcome to Plane instance portal. Start your journey by creating your first
workspace, you will need to login again.
workspace.
</div>
<div className="flex items-center gap-4 pt-2">
<a href={redirectionLink} className={getButtonStyling("primary", "sm")}>
<Link href="/workspace/create" className={getButtonStyling("primary", "sm")}>
Create workspace
</a>
</Link>
<Button variant="neutral-primary" size="sm" onClick={toggleNewUserPopup}>
Close
</Button>

View File

@@ -0,0 +1 @@
export * from "./list-item";

View File

@@ -0,0 +1,81 @@
import { observer } from "mobx-react";
import { ExternalLink } from "lucide-react";
// helpers
import { Tooltip } from "@plane/ui";
import { WEB_BASE_URL } from "@/helpers/common.helper";
import { getFileURL } from "@/helpers/file.helper";
// hooks
import { useWorkspace } from "@/hooks/store";
type TWorkspaceListItemProps = {
workspaceId: string;
};
export const WorkspaceListItem = observer(({ workspaceId }: TWorkspaceListItemProps) => {
// store hooks
const { getWorkspaceById } = useWorkspace();
// derived values
const workspace = getWorkspaceById(workspaceId);
if (!workspace) return null;
return (
<a
key={workspaceId}
href={`${WEB_BASE_URL}/${encodeURIComponent(workspace.slug)}`}
target="_blank"
className="group flex items-center justify-between p-4 gap-2.5 truncate border border-custom-border-200/70 hover:border-custom-border-200 hover:bg-custom-background-90 rounded-md"
>
<div className="flex items-start gap-4">
<span
className={`relative flex h-8 w-8 flex-shrink-0 items-center justify-center p-2 mt-1 text-xs uppercase ${
!workspace?.logo_url && "rounded bg-custom-primary-500 text-white"
}`}
>
{workspace?.logo_url && workspace.logo_url !== "" ? (
<img
src={getFileURL(workspace.logo_url)}
className="absolute left-0 top-0 h-full w-full rounded object-cover"
alt="Workspace Logo"
/>
) : (
(workspace?.name?.[0] ?? "...")
)}
</span>
<div className="flex flex-col items-start gap-1">
<div className="flex flex-wrap w-full items-center gap-2.5">
<h3 className={`text-base font-medium capitalize`}>{workspace.name}</h3>/
<Tooltip tooltipContent="The unique URL of your workspace">
<h4 className="text-sm text-custom-text-300">[{workspace.slug}]</h4>
</Tooltip>
</div>
{workspace.owner.email && (
<div className="flex items-center gap-1 text-xs">
<h3 className="text-custom-text-200 font-medium">Owned by:</h3>
<h4 className="text-custom-text-300">{workspace.owner.email}</h4>
</div>
)}
<div className="flex items-center gap-2.5 text-xs">
{workspace.total_projects !== null && (
<span className="flex items-center gap-1">
<h3 className="text-custom-text-200 font-medium">Total projects:</h3>
<h4 className="text-custom-text-300">{workspace.total_projects}</h4>
</span>
)}
{workspace.total_members !== null && (
<>
<span className="flex items-center gap-1">
<h3 className="text-custom-text-200 font-medium">Total members:</h3>
<h4 className="text-custom-text-300">{workspace.total_members}</h4>
</span>
</>
)}
</div>
</div>
</div>
<div className="flex-shrink-0">
<ExternalLink size={14} className="text-custom-text-400 group-hover:text-custom-text-200" />
</div>
</a>
);
});

View File

@@ -1,3 +1,4 @@
export * from "./use-theme";
export * from "./use-instance";
export * from "./use-user";
export * from "./use-workspace";

View File

@@ -0,0 +1,10 @@
import { useContext } from "react";
// store
import { StoreContext } from "@/lib/store-provider";
import { IWorkspaceStore } from "@/store/workspace.store";
export const useWorkspace = (): IWorkspaceStore => {
const context = useContext(StoreContext);
if (context === undefined) throw new Error("useWorkspace must be used within StoreProvider");
return context.workspace;
};
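For orientation, here is a minimal sketch of how a component might consume this hook, assuming the MobX wiring (StoreProvider, observer) shown elsewhere in this changeset; WorkspaceCount is a hypothetical component name, not part of the diff:
```tsx
import { observer } from "mobx-react";
// hook added in this changeset
import { useWorkspace } from "@/hooks/store";

// Hypothetical consumer: renders how many workspaces the store has loaded.
const WorkspaceCount = observer(() => {
  const { workspaceIds, loader } = useWorkspace();
  if (loader === "init-loader") return <span>Loading workspaces…</span>;
  return <span>{workspaceIds.length} workspaces</span>;
});

export default WorkspaceCount;
```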

View File

@@ -0,0 +1,53 @@
// types
import type { IWorkspace, TWorkspacePaginationInfo } from "@plane/types";
// helpers
import { API_BASE_URL } from "@/helpers/common.helper";
// services
import { APIService } from "@/services/api.service";
export class WorkspaceService extends APIService {
constructor() {
super(API_BASE_URL);
}
/**
* @description Fetches all workspaces
* @returns Promise<TWorkspacePaginationInfo>
*/
async getWorkspaces(nextPageCursor?: string): Promise<TWorkspacePaginationInfo> {
return this.get<TWorkspacePaginationInfo>("/api/instances/workspaces/", {
cursor: nextPageCursor,
})
.then((response) => response.data)
.catch((error) => {
throw error?.response?.data;
});
}
/**
* @description Checks if a slug is available
* @param slug - string
* @returns Promise<any>
*/
async workspaceSlugCheck(slug: string): Promise<any> {
const params = new URLSearchParams({ slug });
return this.get(`/api/instances/workspace-slug-check/?${params.toString()}`)
.then((response) => response?.data)
.catch((error) => {
throw error?.response?.data;
});
}
/**
* @description Creates a new workspace
* @param data - IWorkspace
* @returns Promise<IWorkspace>
*/
async createWorkspace(data: IWorkspace): Promise<IWorkspace> {
return this.post<IWorkspace, IWorkspace>("/api/instances/workspaces/", data)
.then((response) => response.data)
.catch((error) => {
throw error?.response?.data;
});
}
}
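As a rough usage sketch (not part of the diff), the cursor-based pagination of getWorkspaces could be driven like this, assuming the paginated response exposes results, next_cursor, and next_page_results as it does in the workspace store below; fetchAllWorkspaces is an illustrative helper name:
```ts
import type { IWorkspace } from "@plane/types";
import { WorkspaceService } from "@/services/workspace.service";

// Illustrative helper: follows next_cursor until the API reports no further pages.
export const fetchAllWorkspaces = async (): Promise<IWorkspace[]> => {
  const service = new WorkspaceService();
  const allWorkspaces: IWorkspace[] = [];
  let cursor: string | undefined = undefined;
  let hasNextPage = true;
  while (hasNextPage) {
    const page = await service.getWorkspaces(cursor);
    allWorkspaces.push(...(page.results ?? []));
    hasNextPage = Boolean(page.next_page_results) && page.next_cursor !== undefined;
    cursor = page.next_cursor;
  }
  return allWorkspaces;
};
```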

View File

@@ -3,6 +3,7 @@ import { enableStaticRendering } from "mobx-react";
import { IInstanceStore, InstanceStore } from "./instance.store";
import { IThemeStore, ThemeStore } from "./theme.store";
import { IUserStore, UserStore } from "./user.store";
import { IWorkspaceStore, WorkspaceStore } from "./workspace.store";
enableStaticRendering(typeof window === "undefined");
@@ -10,17 +11,20 @@ export abstract class CoreRootStore {
theme: IThemeStore;
instance: IInstanceStore;
user: IUserStore;
workspace: IWorkspaceStore;
constructor() {
this.theme = new ThemeStore(this);
this.instance = new InstanceStore(this);
this.user = new UserStore(this);
this.workspace = new WorkspaceStore(this);
}
hydrate(initialData: any) {
this.theme.hydrate(initialData.theme);
this.instance.hydrate(initialData.instance);
this.user.hydrate(initialData.user);
this.workspace.hydrate(initialData.workspace);
}
resetOnSignOut() {
@@ -28,5 +32,6 @@ export abstract class CoreRootStore {
this.instance = new InstanceStore(this);
this.user = new UserStore(this);
this.theme = new ThemeStore(this);
this.workspace = new WorkspaceStore(this);
}
}

View File

@@ -0,0 +1,150 @@
import set from "lodash/set";
import { action, observable, runInAction, makeObservable, computed } from "mobx";
import { IWorkspace, TLoader, TPaginationInfo } from "@plane/types";
// services
import { WorkspaceService } from "@/services/workspace.service";
// root store
import { CoreRootStore } from "@/store/root.store";
export interface IWorkspaceStore {
// observables
loader: TLoader;
workspaces: Record<string, IWorkspace>;
paginationInfo: TPaginationInfo | undefined;
// computed
workspaceIds: string[];
// helper actions
hydrate: (data: Record<string, IWorkspace>) => void;
getWorkspaceById: (workspaceId: string) => IWorkspace | undefined;
// fetch actions
fetchWorkspaces: () => Promise<IWorkspace[]>;
fetchNextWorkspaces: () => Promise<IWorkspace[]>;
// crud actions
createWorkspace: (data: IWorkspace) => Promise<IWorkspace>;
}
export class WorkspaceStore implements IWorkspaceStore {
// observables
loader: TLoader = "init-loader";
workspaces: Record<string, IWorkspace> = {};
paginationInfo: TPaginationInfo | undefined = undefined;
// services
workspaceService;
constructor(private store: CoreRootStore) {
makeObservable(this, {
// observables
loader: observable,
workspaces: observable,
paginationInfo: observable,
// computed
workspaceIds: computed,
// helper actions
hydrate: action,
getWorkspaceById: action,
// fetch actions
fetchWorkspaces: action,
fetchNextWorkspaces: action,
// crud actions
createWorkspace: action,
});
this.workspaceService = new WorkspaceService();
}
// computed
get workspaceIds() {
return Object.keys(this.workspaces);
}
// helper actions
/**
* @description Hydrates the workspaces
* @param data - Record<string, IWorkspace>
*/
hydrate = (data: Record<string, IWorkspace>) => {
if (data) this.workspaces = data;
};
/**
* @description Gets a workspace by id
* @param workspaceId - string
* @returns IWorkspace | undefined
*/
getWorkspaceById = (workspaceId: string) => this.workspaces[workspaceId];
// fetch actions
/**
* @description Fetches all workspaces
* @returns Promise<IWorkspace[]>
*/
fetchWorkspaces = async (): Promise<IWorkspace[]> => {
try {
if (this.workspaceIds.length > 0) {
this.loader = "mutation";
} else {
this.loader = "init-loader";
}
const paginatedWorkspaceData = await this.workspaceService.getWorkspaces();
runInAction(() => {
const { results, ...paginationInfo } = paginatedWorkspaceData;
results.forEach((workspace: IWorkspace) => {
set(this.workspaces, [workspace.id], workspace);
});
set(this, "paginationInfo", paginationInfo);
});
return paginatedWorkspaceData.results;
} catch (error) {
console.error("Error fetching workspaces", error);
throw error;
} finally {
this.loader = "loaded";
}
};
/**
* @description Fetches the next page of workspaces
* @returns Promise<IWorkspace[]>
*/
fetchNextWorkspaces = async (): Promise<IWorkspace[]> => {
if (!this.paginationInfo || this.paginationInfo.next_page_results === false) return [];
try {
this.loader = "pagination";
const paginatedWorkspaceData = await this.workspaceService.getWorkspaces(this.paginationInfo.next_cursor);
runInAction(() => {
const { results, ...paginationInfo } = paginatedWorkspaceData;
results.forEach((workspace: IWorkspace) => {
set(this.workspaces, [workspace.id], workspace);
});
set(this, "paginationInfo", paginationInfo);
});
return paginatedWorkspaceData.results;
} catch (error) {
console.error("Error fetching next workspaces", error);
throw error;
} finally {
this.loader = "loaded";
}
};
// crud actions
/**
* @description Creates a new workspace
* @param data - IWorkspace
* @returns Promise<IWorkspace>
*/
createWorkspace = async (data: IWorkspace): Promise<IWorkspace> => {
try {
this.loader = "mutation";
const workspace = await this.workspaceService.createWorkspace(data);
runInAction(() => {
set(this.workspaces, [workspace.id], workspace);
});
return workspace;
} catch (error) {
console.error("Error creating workspace", error);
throw error;
} finally {
this.loader = "loaded";
}
};
}

View File

@@ -0,0 +1,14 @@
// helpers
import { API_BASE_URL } from "@/helpers/common.helper";
/**
* @description combine the file path with the base URL
* @param {string} path
* @returns {string} final URL with the base URL
*/
export const getFileURL = (path: string): string | undefined => {
if (!path) return undefined;
const isValidURL = path.startsWith("http");
if (isValidURL) return path;
return `${API_BASE_URL}${path}`;
};
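To make the helper's behaviour concrete, a short sketch of the cases it handles (the sample paths are made up, and the import path assumes the new admin file.helper module):
```ts
import { getFileURL } from "@/helpers/file.helper";

// Absolute URLs pass through unchanged.
getFileURL("https://cdn.example.com/logo.png"); // "https://cdn.example.com/logo.png"

// Relative paths are prefixed with API_BASE_URL.
getFileURL("/api/assets/workspace-logo.png"); // `${API_BASE_URL}/api/assets/workspace-logo.png`

// Empty paths yield undefined, which callers such as <Avatar src={...} /> accept.
getFileURL(""); // undefined
```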

View File

@@ -0,0 +1,21 @@
/**
* @description
* This function tests whether a URL is valid or not.
*
* It accepts URLs with or without the protocol.
* @param {string} url
* @returns {boolean}
* @example
* checkURLValidity("https://example.com") => true
* checkURLValidity("example.com") => true
* checkURLValidity("example") => false
*/
export const checkURLValidity = (url: string): boolean => {
if (!url) return false;
// regex to support complex query parameters and fragments
const urlPattern =
/^(https?:\/\/)?((([a-z\d-]+\.)*[a-z\d-]+\.[a-z]{2,6})|(\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}))(:\d+)?(\/[\w.-]*)*(\?[^#\s]*)?(#[\w-]*)?$/i;
return urlPattern.test(url);
};
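Since the comment above says the regex supports query parameters and fragments, a couple of illustrative checks follow (values made up; the import path is an assumption about where this new helper lives):
```ts
import { checkURLValidity } from "@/helpers/url.helper";

// Host with a TLD plus a query string and a fragment: accepted by the pattern above.
checkURLValidity("https://app.plane.so/workspace?tab=members#invites"); // true

// Bare host without a TLD: rejected.
checkURLValidity("localhost:3000"); // false
```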

View File

@@ -1,6 +1,6 @@
{
"name": "admin",
"version": "0.23.0",
"version": "0.24.1",
"private": true,
"scripts": {
"dev": "turbo run develop",
@@ -14,20 +14,20 @@
"dependencies": {
"@headlessui/react": "^1.7.19",
"@plane/constants": "*",
"@plane/helpers": "*",
"@plane/hooks": "*",
"@plane/types": "*",
"@plane/ui": "*",
"@plane/utils": "*",
"@sentry/nextjs": "^8.32.0",
"@tailwindcss/typography": "^0.5.9",
"@types/lodash": "^4.17.0",
"autoprefixer": "10.4.14",
"axios": "^1.7.4",
"js-cookie": "^3.0.5",
"lodash": "^4.17.21",
"lucide-react": "^0.356.0",
"mobx": "^6.12.0",
"mobx-react": "^9.1.1",
"next": "^14.2.12",
"next": "^14.2.20",
"next-themes": "^0.2.1",
"postcss": "^8.4.38",
"react": "^18.3.1",
@@ -41,9 +41,8 @@
"devDependencies": {
"@plane/eslint-config": "*",
"@plane/typescript-config": "*",
"@types/js-cookie": "^3.0.6",
"@types/node": "18.16.1",
"@types/react": "^18.2.48",
"@types/react": "^18.3.11",
"@types/react-dom": "^18.2.18",
"@types/uuid": "^9.0.8",
"@types/zxcvbn": "^4.4.4",

View File

@@ -57,5 +57,6 @@ ADMIN_BASE_URL=
SPACE_BASE_URL=
APP_BASE_URL=
# Hard delete files after days
HARD_DELETE_AFTER_DAYS=60
HARD_DELETE_AFTER_DAYS=60

View File

@@ -4,6 +4,7 @@ FROM python:3.12.5-alpine AS backend
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
ENV PIP_DISABLE_PIP_VERSION_CHECK=1
ENV INSTANCE_CHANGELOG_URL https://sites.plane.so/pages/691ef037bcfe416a902e48cb55f59891/
WORKDIR /code

View File

@@ -4,6 +4,7 @@ FROM python:3.12.5-alpine AS backend
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
ENV PIP_DISABLE_PIP_VERSION_CHECK=1
ENV INSTANCE_CHANGELOG_URL https://sites.plane.so/pages/691ef037bcfe416a902e48cb55f59891/
RUN apk --no-cache add \
"bash~=5.2" \

View File

@@ -26,9 +26,7 @@ def update_description():
updated_issues.append(issue)
Issue.objects.bulk_update(
updated_issues,
["description_html", "description_stripped"],
batch_size=100,
updated_issues, ["description_html", "description_stripped"], batch_size=100
)
print("Success")
except Exception as e:
@@ -42,9 +40,7 @@ def update_comments():
updated_issue_comments = []
for issue_comment in issue_comments:
issue_comment.comment_html = (
f"<p>{issue_comment.comment_stripped}</p>"
)
issue_comment.comment_html = f"<p>{issue_comment.comment_stripped}</p>"
updated_issue_comments.append(issue_comment)
IssueComment.objects.bulk_update(
@@ -103,9 +99,7 @@ def updated_issue_sort_order():
issue.sort_order = issue.sequence_id * random.randint(100, 500)
updated_issues.append(issue)
Issue.objects.bulk_update(
updated_issues, ["sort_order"], batch_size=100
)
Issue.objects.bulk_update(updated_issues, ["sort_order"], batch_size=100)
print("Success")
except Exception as e:
print(e)
@@ -143,9 +137,7 @@ def update_project_cover_images():
project.cover_image = project_cover_images[random.randint(0, 19)]
updated_projects.append(project)
Project.objects.bulk_update(
updated_projects, ["cover_image"], batch_size=100
)
Project.objects.bulk_update(updated_projects, ["cover_image"], batch_size=100)
print("Success")
except Exception as e:
print(e)
@@ -194,9 +186,7 @@ def update_label_color():
def create_slack_integration():
try:
_ = Integration.objects.create(
provider="slack", network=2, title="Slack"
)
_ = Integration.objects.create(provider="slack", network=2, title="Slack")
print("Success")
except Exception as e:
print(e)
@@ -222,16 +212,12 @@ def update_integration_verified():
def update_start_date():
try:
issues = Issue.objects.filter(
state__group__in=["started", "completed"]
)
issues = Issue.objects.filter(state__group__in=["started", "completed"])
updated_issues = []
for issue in issues:
issue.start_date = issue.created_at.date()
updated_issues.append(issue)
Issue.objects.bulk_update(
updated_issues, ["start_date"], batch_size=500
)
Issue.objects.bulk_update(updated_issues, ["start_date"], batch_size=500)
print("Success")
except Exception as e:
print(e)

View File

@@ -3,9 +3,7 @@ import os
import sys
if __name__ == "__main__":
os.environ.setdefault(
"DJANGO_SETTINGS_MODULE", "plane.settings.production"
)
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "plane.settings.production")
try:
from django.core.management import execute_from_command_line
except ImportError as exc:

View File

@@ -1,4 +1,4 @@
{
"name": "plane-api",
"version": "0.23.0"
"version": "0.24.1"
}

View File

@@ -25,10 +25,7 @@ class APIKeyAuthentication(authentication.BaseAuthentication):
def validate_api_token(self, token):
try:
api_token = APIToken.objects.get(
Q(
Q(expired_at__gt=timezone.now())
| Q(expired_at__isnull=True)
),
Q(Q(expired_at__gt=timezone.now()) | Q(expired_at__isnull=True)),
token=token,
is_active=True,
)

View File

@@ -80,4 +80,4 @@ class ServiceTokenRateThrottle(SimpleRateThrottle):
request.META["X-RateLimit-Remaining"] = max(0, available)
request.META["X-RateLimit-Reset"] = reset_time
return allowed
return allowed

View File

@@ -5,7 +5,6 @@ from .issue import (
IssueSerializer,
LabelSerializer,
IssueLinkSerializer,
IssueAttachmentSerializer,
IssueCommentSerializer,
IssueAttachmentSerializer,
IssueActivitySerializer,
@@ -14,9 +13,5 @@ from .issue import (
)
from .state import StateLiteSerializer, StateSerializer
from .cycle import CycleSerializer, CycleIssueSerializer, CycleLiteSerializer
from .module import (
ModuleSerializer,
ModuleIssueSerializer,
ModuleLiteSerializer,
)
from .inbox import InboxIssueSerializer
from .module import ModuleSerializer, ModuleIssueSerializer, ModuleLiteSerializer
from .intake import IntakeIssueSerializer

View File

@@ -102,8 +102,6 @@ class BaseSerializer(serializers.ModelSerializer):
response[expand] = exp_serializer.data
else:
# You might need to handle this case differently
response[expand] = getattr(
instance, f"{expand}_id", None
)
response[expand] = getattr(instance, f"{expand}_id", None)
return response

View File

@@ -4,7 +4,7 @@ from rest_framework import serializers
# Module imports
from .base import BaseSerializer
from plane.db.models import Cycle, CycleIssue
from plane.utils.timezone_converter import convert_to_utc
class CycleSerializer(BaseSerializer):
total_issues = serializers.IntegerField(read_only=True)
@@ -23,8 +23,18 @@ class CycleSerializer(BaseSerializer):
and data.get("end_date", None) is not None
and data.get("start_date", None) > data.get("end_date", None)
):
raise serializers.ValidationError(
"Start date cannot exceed end date"
raise serializers.ValidationError("Start date cannot exceed end date")
if (
data.get("start_date", None) is not None
and data.get("end_date", None) is not None
):
project_id = self.initial_data.get("project_id") or self.instance.project_id
data["start_date"] = convert_to_utc(
str(data.get("start_date").date()), project_id, is_start_date=True
)
data["end_date"] = convert_to_utc(
str(data.get("end_date", None).date()), project_id
)
return data
@@ -50,11 +60,7 @@ class CycleIssueSerializer(BaseSerializer):
class Meta:
model = CycleIssue
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"cycle",
]
read_only_fields = ["workspace", "project", "cycle"]
class CycleLiteSerializer(BaseSerializer):

View File

@@ -1,15 +1,16 @@
# Module imports
from .base import BaseSerializer
from .issue import IssueExpandSerializer
from plane.db.models import InboxIssue
from plane.db.models import IntakeIssue
from rest_framework import serializers
class InboxIssueSerializer(BaseSerializer):
class IntakeIssueSerializer(BaseSerializer):
issue_detail = IssueExpandSerializer(read_only=True, source="issue")
inbox = serializers.UUIDField(source="intake.id", read_only=True)
class Meta:
model = InboxIssue
model = IntakeIssue
fields = "__all__"
read_only_fields = [
"id",

View File

@@ -11,7 +11,7 @@ from plane.db.models import (
IssueType,
IssueActivity,
IssueAssignee,
IssueAttachment,
FileAsset,
IssueComment,
IssueLabel,
IssueLink,
@@ -31,6 +31,7 @@ from .user import UserLiteSerializer
from django.core.exceptions import ValidationError
from django.core.validators import URLValidator
class IssueSerializer(BaseSerializer):
assignees = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(
@@ -48,25 +49,13 @@ class IssueSerializer(BaseSerializer):
required=False,
)
type_id = serializers.PrimaryKeyRelatedField(
source="type",
queryset=IssueType.objects.all(),
required=False,
allow_null=True,
source="type", queryset=IssueType.objects.all(), required=False, allow_null=True
)
class Meta:
model = Issue
read_only_fields = [
"id",
"workspace",
"project",
"updated_by",
"updated_at",
]
exclude = [
"description",
"description_stripped",
]
read_only_fields = ["id", "workspace", "project", "updated_by", "updated_at"]
exclude = ["description", "description_stripped"]
def validate(self, data):
if (
@@ -74,9 +63,7 @@ class IssueSerializer(BaseSerializer):
and data.get("target_date", None) is not None
and data.get("start_date", None) > data.get("target_date", None)
):
raise serializers.ValidationError(
"Start date cannot exceed target date"
)
raise serializers.ValidationError("Start date cannot exceed target date")
try:
if data.get("description_html", None) is not None:
@@ -98,16 +85,14 @@ class IssueSerializer(BaseSerializer):
# Validate labels are from project
if data.get("labels", []):
data["labels"] = Label.objects.filter(
project_id=self.context.get("project_id"),
id__in=data["labels"],
project_id=self.context.get("project_id"), id__in=data["labels"]
).values_list("id", flat=True)
# Check state is from the project only else raise validation error
if (
data.get("state")
and not State.objects.filter(
project_id=self.context.get("project_id"),
pk=data.get("state").id,
project_id=self.context.get("project_id"), pk=data.get("state").id
).exists()
):
raise serializers.ValidationError(
@@ -118,8 +103,7 @@ class IssueSerializer(BaseSerializer):
if (
data.get("parent")
and not Issue.objects.filter(
workspace_id=self.context.get("workspace_id"),
pk=data.get("parent").id,
workspace_id=self.context.get("workspace_id"), pk=data.get("parent").id
).exists()
):
raise serializers.ValidationError(
@@ -146,9 +130,7 @@ class IssueSerializer(BaseSerializer):
issue_type = issue_type
issue = Issue.objects.create(
**validated_data,
project_id=project_id,
type=issue_type,
**validated_data, project_id=project_id, type=issue_type
)
# Issue Audit Users
@@ -263,13 +245,9 @@ class IssueSerializer(BaseSerializer):
]
if "labels" in self.fields:
if "labels" in self.expand:
data["labels"] = LabelSerializer(
instance.labels.all(), many=True
).data
data["labels"] = LabelSerializer(instance.labels.all(), many=True).data
else:
data["labels"] = [
str(label.id) for label in instance.labels.all()
]
data["labels"] = [str(label.id) for label in instance.labels.all()]
return data
@@ -277,11 +255,7 @@ class IssueSerializer(BaseSerializer):
class IssueLiteSerializer(BaseSerializer):
class Meta:
model = Issue
fields = [
"id",
"sequence_id",
"project_id",
]
fields = ["id", "sequence_id", "project_id"]
read_only_fields = fields
@@ -315,7 +289,7 @@ class IssueLinkSerializer(BaseSerializer):
"created_at",
"updated_at",
]
def validate_url(self, value):
# Check URL format
validate_url = URLValidator()
@@ -333,8 +307,7 @@ class IssueLinkSerializer(BaseSerializer):
# Validation if url already exists
def create(self, validated_data):
if IssueLink.objects.filter(
url=validated_data.get("url"),
issue_id=validated_data.get("issue_id"),
url=validated_data.get("url"), issue_id=validated_data.get("issue_id")
).exists():
raise serializers.ValidationError(
{"error": "URL already exists for this Issue"}
@@ -344,8 +317,7 @@ class IssueLinkSerializer(BaseSerializer):
def update(self, instance, validated_data):
if (
IssueLink.objects.filter(
url=validated_data.get("url"),
issue_id=instance.issue_id,
url=validated_data.get("url"), issue_id=instance.issue_id
)
.exclude(pk=instance.id)
.exists()
@@ -359,7 +331,7 @@ class IssueLinkSerializer(BaseSerializer):
class IssueAttachmentSerializer(BaseSerializer):
class Meta:
model = IssueAttachment
model = FileAsset
fields = "__all__"
read_only_fields = [
"id",
@@ -386,10 +358,7 @@ class IssueCommentSerializer(BaseSerializer):
"created_at",
"updated_at",
]
exclude = [
"comment_stripped",
"comment_json",
]
exclude = ["comment_stripped", "comment_json"]
def validate(self, data):
try:
@@ -406,38 +375,27 @@ class IssueCommentSerializer(BaseSerializer):
class IssueActivitySerializer(BaseSerializer):
class Meta:
model = IssueActivity
exclude = [
"created_by",
"updated_by",
]
exclude = ["created_by", "updated_by"]
class CycleIssueSerializer(BaseSerializer):
cycle = CycleSerializer(read_only=True)
class Meta:
fields = [
"cycle",
]
fields = ["cycle"]
class ModuleIssueSerializer(BaseSerializer):
module = ModuleSerializer(read_only=True)
class Meta:
fields = [
"module",
]
fields = ["module"]
class LabelLiteSerializer(BaseSerializer):
class Meta:
model = Label
fields = [
"id",
"name",
"color",
]
fields = ["id", "name", "color"]
class IssueExpandSerializer(BaseSerializer):

View File

@@ -53,14 +53,11 @@ class ModuleSerializer(BaseSerializer):
and data.get("target_date", None) is not None
and data.get("start_date", None) > data.get("target_date", None)
):
raise serializers.ValidationError(
"Start date cannot exceed target date"
)
raise serializers.ValidationError("Start date cannot exceed target date")
if data.get("members", []):
data["members"] = ProjectMember.objects.filter(
project_id=self.context.get("project_id"),
member_id__in=data["members"],
project_id=self.context.get("project_id"), member_id__in=data["members"]
).values_list("member_id", flat=True)
return data
@@ -74,9 +71,7 @@ class ModuleSerializer(BaseSerializer):
module_name = validated_data.get("name")
if module_name:
# Lookup for the module name in the module table for that project
if Module.objects.filter(
name=module_name, project_id=project_id
).exists():
if Module.objects.filter(name=module_name, project_id=project_id).exists():
raise serializers.ValidationError(
{"error": "Module with this name already exists"}
)
@@ -107,9 +102,7 @@ class ModuleSerializer(BaseSerializer):
if module_name:
# Lookup for the module name in the module table for that project
if (
Module.objects.filter(
name=module_name, project=instance.project
)
Module.objects.filter(name=module_name, project=instance.project)
.exclude(id=instance.id)
.exists()
):
@@ -172,8 +165,7 @@ class ModuleLinkSerializer(BaseSerializer):
# Validation if url already exists
def create(self, validated_data):
if ModuleLink.objects.filter(
url=validated_data.get("url"),
module_id=validated_data.get("module_id"),
url=validated_data.get("url"), module_id=validated_data.get("module_id")
).exists():
raise serializers.ValidationError(
{"error": "URL already exists for this Issue"}

View File

@@ -2,11 +2,7 @@
from rest_framework import serializers
# Module imports
from plane.db.models import (
Project,
ProjectIdentifier,
WorkspaceMember,
)
from plane.db.models import Project, ProjectIdentifier, WorkspaceMember
from .base import BaseSerializer
@@ -19,6 +15,8 @@ class ProjectSerializer(BaseSerializer):
sort_order = serializers.FloatField(read_only=True)
member_role = serializers.IntegerField(read_only=True)
is_deployed = serializers.BooleanField(read_only=True)
cover_image_url = serializers.CharField(read_only=True)
inbox_view = serializers.BooleanField(read_only=True, source="intake_view")
class Meta:
model = Project
@@ -32,6 +30,7 @@ class ProjectSerializer(BaseSerializer):
"created_by",
"updated_by",
"deleted_at",
"cover_image_url",
]
def validate(self, data):
@@ -64,16 +63,12 @@ class ProjectSerializer(BaseSerializer):
def create(self, validated_data):
identifier = validated_data.get("identifier", "").strip().upper()
if identifier == "":
raise serializers.ValidationError(
detail="Project Identifier is required"
)
raise serializers.ValidationError(detail="Project Identifier is required")
if ProjectIdentifier.objects.filter(
name=identifier, workspace_id=self.context["workspace_id"]
).exists():
raise serializers.ValidationError(
detail="Project Identifier is taken"
)
raise serializers.ValidationError(detail="Project Identifier is taken")
project = Project.objects.create(
**validated_data, workspace_id=self.context["workspace_id"]
@@ -87,6 +82,8 @@ class ProjectSerializer(BaseSerializer):
class ProjectLiteSerializer(BaseSerializer):
cover_image_url = serializers.CharField(read_only=True)
class Meta:
model = Project
fields = [
@@ -97,5 +94,6 @@ class ProjectLiteSerializer(BaseSerializer):
"icon_prop",
"emoji",
"description",
"cover_image_url",
]
read_only_fields = fields

View File

@@ -7,9 +7,9 @@ class StateSerializer(BaseSerializer):
def validate(self, data):
# If the default is being provided then make all other states default False
if data.get("default", False):
State.objects.filter(
project_id=self.context.get("project_id")
).update(default=False)
State.objects.filter(project_id=self.context.get("project_id")).update(
default=False
)
return data
class Meta:
@@ -30,10 +30,5 @@ class StateSerializer(BaseSerializer):
class StateLiteSerializer(BaseSerializer):
class Meta:
model = State
fields = [
"id",
"name",
"color",
"group",
]
fields = ["id", "name", "color", "group"]
read_only_fields = fields

View File

@@ -13,6 +13,7 @@ class UserLiteSerializer(BaseSerializer):
"last_name",
"email",
"avatar",
"avatar_url",
"display_name",
"email",
]

View File

@@ -8,9 +8,5 @@ class WorkspaceLiteSerializer(BaseSerializer):
class Meta:
model = Workspace
fields = [
"name",
"slug",
"id",
]
fields = ["name", "slug", "id"]
read_only_fields = fields

View File

@@ -3,7 +3,7 @@ from .state import urlpatterns as state_patterns
from .issue import urlpatterns as issue_patterns
from .cycle import urlpatterns as cycle_patterns
from .module import urlpatterns as module_patterns
from .inbox import urlpatterns as inbox_patterns
from .intake import urlpatterns as intake_patterns
from .member import urlpatterns as member_patterns
urlpatterns = [
@@ -12,6 +12,6 @@ urlpatterns = [
*issue_patterns,
*cycle_patterns,
*module_patterns,
*inbox_patterns,
*intake_patterns,
*member_patterns,
]

View File

@@ -1,17 +0,0 @@
from django.urls import path
from plane.api.views import InboxIssueAPIEndpoint
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inbox-issues/",
InboxIssueAPIEndpoint.as_view(),
name="inbox-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inbox-issues/<uuid:issue_id>/",
InboxIssueAPIEndpoint.as_view(),
name="inbox-issue",
),
]

View File

@@ -0,0 +1,27 @@
from django.urls import path
from plane.api.views import IntakeIssueAPIEndpoint
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inbox-issues/",
IntakeIssueAPIEndpoint.as_view(),
name="inbox-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inbox-issues/<uuid:issue_id>/",
IntakeIssueAPIEndpoint.as_view(),
name="inbox-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/intake-issues/",
IntakeIssueAPIEndpoint.as_view(),
name="intake-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/intake-issues/<uuid:issue_id>/",
IntakeIssueAPIEndpoint.as_view(),
name="intake-issue",
),
]
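The deleted inbox URL module is replaced by this intake module, which deliberately registers the legacy inbox-issues/ paths alongside the new intake-issues/ ones, all pointing at IntakeIssueAPIEndpoint, so existing integrations keep working through the rename. A quick way to confirm both spellings hit the same view from a Django shell — assuming these patterns are mounted at the root URLconf (in the real project they sit under the API prefix), with placeholder slug and UUID:

    from django.urls import resolve

    legacy = resolve("/workspaces/my-team/projects/3fa85f64-5717-4562-b3fc-2c963f66afa6/inbox-issues/")
    renamed = resolve("/workspaces/my-team/projects/3fa85f64-5717-4562-b3fc-2c963f66afa6/intake-issues/")
    assert legacy.func.view_class is renamed.func.view_class  # both resolve to IntakeIssueAPIEndpoint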

View File

@@ -1,13 +1,11 @@
from django.urls import path
from plane.api.views import (
ProjectMemberAPIEndpoint,
)
from plane.api.views import ProjectMemberAPIEndpoint
urlpatterns = [
path(
"workspaces/<str:slug>/projects/<str:project_id>/members/",
ProjectMemberAPIEndpoint.as_view(),
name="users",
),
)
]

View File

@@ -1,15 +1,10 @@
from django.urls import path
from plane.api.views import (
ProjectAPIEndpoint,
ProjectArchiveUnarchiveAPIEndpoint,
)
from plane.api.views import ProjectAPIEndpoint, ProjectArchiveUnarchiveAPIEndpoint
urlpatterns = [
path(
"workspaces/<str:slug>/projects/",
ProjectAPIEndpoint.as_view(),
name="project",
"workspaces/<str:slug>/projects/", ProjectAPIEndpoint.as_view(), name="project"
),
path(
"workspaces/<str:slug>/projects/<uuid:pk>/",

View File

@@ -27,5 +27,4 @@ from .module import (
from .member import ProjectMemberAPIEndpoint
from .inbox import InboxIssueAPIEndpoint
from .intake import IntakeIssueAPIEndpoint

View File

@@ -37,13 +37,9 @@ class TimezoneMixin:
class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
authentication_classes = [
APIKeyAuthentication,
]
authentication_classes = [APIKeyAuthentication]
permission_classes = [
IsAuthenticated,
]
permission_classes = [IsAuthenticated]
def filter_queryset(self, queryset):
for backend in list(self.filter_backends):
@@ -56,8 +52,7 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
if api_key:
service_token = APIToken.objects.filter(
token=api_key,
is_service=True,
token=api_key, is_service=True
).first()
if service_token:
@@ -123,9 +118,7 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
def finalize_response(self, request, response, *args, **kwargs):
# Call super to get the default response
response = super().finalize_response(
request, response, *args, **kwargs
)
response = super().finalize_response(request, response, *args, **kwargs)
# Add custom headers if they exist in the request META
ratelimit_remaining = request.META.get("X-RateLimit-Remaining")
@@ -154,17 +147,13 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
@property
def fields(self):
fields = [
field
for field in self.request.GET.get("fields", "").split(",")
if field
field for field in self.request.GET.get("fields", "").split(",") if field
]
return fields if fields else None
@property
def expand(self):
expand = [
expand
for expand in self.request.GET.get("expand", "").split(",")
if expand
expand for expand in self.request.GET.get("expand", "").split(",") if expand
]
return expand if expand else None

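The fields and expand properties above drive dynamic serialization across the API: callers pass comma-separated lists, the view splits them, and the serializer is constructed with fields=... and expand=.... The parsing itself is plain string handling; a standalone equivalent (the query values are illustrative, and which keys a given serializer honours depends on that serializer):

    def parse_csv_param(raw: str):
        # "id,name,," -> ["id", "name"]; "" -> None, mirroring the properties above.
        values = [value for value in raw.split(",") if value]
        return values or None

    assert parse_csv_param("id,name,priority") == ["id", "name", "priority"]
    assert parse_csv_param("") is None

So a request such as GET .../issues/?fields=id,name&expand=state reaches the serializer as fields=["id", "name"] and expand=["state"].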
View File

@@ -13,18 +13,19 @@ from django.db.models import (
Q,
Sum,
FloatField,
Case,
When,
Value,
)
from django.db.models.functions import Cast
from django.db.models.functions import Cast, Concat
from django.db import models
# Third party imports
from rest_framework import status
from rest_framework.response import Response
# Module imports
from plane.api.serializers import (
CycleIssueSerializer,
CycleSerializer,
)
from plane.api.serializers import CycleIssueSerializer, CycleSerializer
from plane.app.permissions import ProjectEntityPermission
from plane.bgtasks.issue_activities_task import issue_activity
from plane.db.models import (
@@ -32,7 +33,7 @@ from plane.db.models import (
CycleIssue,
Issue,
Project,
IssueAttachment,
FileAsset,
IssueLink,
ProjectMember,
UserFavorite,
@@ -53,9 +54,7 @@ class CycleAPIEndpoint(BaseAPIView):
serializer_class = CycleSerializer
model = Cycle
webhook_event = "cycle"
permission_classes = [
ProjectEntityPermission,
]
permission_classes = [ProjectEntityPermission]
def get_queryset(self):
return (
@@ -74,6 +73,7 @@ class CycleAPIEndpoint(BaseAPIView):
filter=Q(
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -84,6 +84,7 @@ class CycleAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="completed",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -94,6 +95,7 @@ class CycleAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="cancelled",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -104,6 +106,7 @@ class CycleAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="started",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -114,6 +117,7 @@ class CycleAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="unstarted",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -124,6 +128,7 @@ class CycleAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="backlog",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
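Every filtered count in this queryset gains an issue_cycle__deleted_at__isnull=True condition, so soft-deleted cycle-issue links stop inflating the cycle's totals. The building block is a conditional aggregate: Count with a Q filter. A reduced sketch with just two of the annotations (relation and field names follow the diff):

    from django.db.models import Count, Q
    from plane.db.models import Cycle

    cycles = Cycle.objects.annotate(
        total_issues=Count(
            "issue_cycle",
            filter=Q(
                issue_cycle__issue__archived_at__isnull=True,
                issue_cycle__issue__is_draft=False,
                issue_cycle__deleted_at__isnull=True,  # skip soft-deleted links
            ),
        ),
        completed_issues=Count(
            "issue_cycle",
            filter=Q(
                issue_cycle__issue__state__group="completed",
                issue_cycle__issue__archived_at__isnull=True,
                issue_cycle__issue__is_draft=False,
                issue_cycle__deleted_at__isnull=True,
            ),
        ),
    )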
@@ -133,26 +138,18 @@ class CycleAPIEndpoint(BaseAPIView):
def get(self, request, slug, project_id, pk=None):
if pk:
queryset = (
self.get_queryset().filter(archived_at__isnull=True).get(pk=pk)
)
queryset = self.get_queryset().filter(archived_at__isnull=True).get(pk=pk)
data = CycleSerializer(
queryset,
fields=self.fields,
expand=self.expand,
queryset, fields=self.fields, expand=self.expand
).data
return Response(
data,
status=status.HTTP_200_OK,
)
return Response(data, status=status.HTTP_200_OK)
queryset = self.get_queryset().filter(archived_at__isnull=True)
cycle_view = request.GET.get("cycle_view", "all")
# Current Cycle
if cycle_view == "current":
queryset = queryset.filter(
start_date__lte=timezone.now(),
end_date__gte=timezone.now(),
start_date__lte=timezone.now(), end_date__gte=timezone.now()
)
data = CycleSerializer(
queryset, many=True, fields=self.fields, expand=self.expand
@@ -166,10 +163,7 @@ class CycleAPIEndpoint(BaseAPIView):
request=request,
queryset=(queryset),
on_results=lambda cycles: CycleSerializer(
cycles,
many=True,
fields=self.fields,
expand=self.expand,
cycles, many=True, fields=self.fields, expand=self.expand
).data,
)
@@ -180,54 +174,38 @@ class CycleAPIEndpoint(BaseAPIView):
request=request,
queryset=(queryset),
on_results=lambda cycles: CycleSerializer(
cycles,
many=True,
fields=self.fields,
expand=self.expand,
cycles, many=True, fields=self.fields, expand=self.expand
).data,
)
# Draft Cycles
if cycle_view == "draft":
queryset = queryset.filter(
end_date=None,
start_date=None,
)
queryset = queryset.filter(end_date=None, start_date=None)
return self.paginate(
request=request,
queryset=(queryset),
on_results=lambda cycles: CycleSerializer(
cycles,
many=True,
fields=self.fields,
expand=self.expand,
cycles, many=True, fields=self.fields, expand=self.expand
).data,
)
# Incomplete Cycles
if cycle_view == "incomplete":
queryset = queryset.filter(
Q(end_date__gte=timezone.now().date())
| Q(end_date__isnull=True),
Q(end_date__gte=timezone.now()) | Q(end_date__isnull=True)
)
return self.paginate(
request=request,
queryset=(queryset),
on_results=lambda cycles: CycleSerializer(
cycles,
many=True,
fields=self.fields,
expand=self.expand,
cycles, many=True, fields=self.fields, expand=self.expand
).data,
)
return self.paginate(
request=request,
queryset=(queryset),
on_results=lambda cycles: CycleSerializer(
cycles,
many=True,
fields=self.fields,
expand=self.expand,
cycles, many=True, fields=self.fields, expand=self.expand
).data,
)
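Note the date-handling change threaded through this file: comparisons that used timezone.now().date() now use timezone.now() directly, both in the incomplete filter here and in the completed-cycle guards later on, which implies the cycle start/end fields are now datetimes rather than dates (comparing a plain date to timezone.now() would otherwise fail). A compressed view of the cycle_view filters under that assumption, where queryset is the annotated cycle queryset from get_queryset():

    from django.db.models import Q
    from django.utils import timezone

    now = timezone.now()  # assumes start_date/end_date are DateTimeFields

    current = queryset.filter(start_date__lte=now, end_date__gte=now)
    draft = queryset.filter(start_date=None, end_date=None)
    incomplete = queryset.filter(Q(end_date__gte=now) | Q(end_date__isnull=True))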
@@ -264,10 +242,7 @@ class CycleAPIEndpoint(BaseAPIView):
},
status=status.HTTP_409_CONFLICT,
)
serializer.save(
project_id=project_id,
owned_by=request.user,
)
serializer.save(project_id=project_id, owned_by=request.user)
# Send the model activity
model_activity.delay(
model_name="cycle",
@@ -278,12 +253,8 @@ class CycleAPIEndpoint(BaseAPIView):
slug=slug,
origin=request.META.get("HTTP_ORIGIN"),
)
return Response(
serializer.data, status=status.HTTP_201_CREATED
)
return Response(
serializer.errors, status=status.HTTP_400_BAD_REQUEST
)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
else:
return Response(
{
@@ -293,9 +264,7 @@ class CycleAPIEndpoint(BaseAPIView):
)
def patch(self, request, slug, project_id, pk):
cycle = Cycle.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
cycle = Cycle.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
current_instance = json.dumps(
CycleSerializer(cycle).data, cls=DjangoJSONEncoder
@@ -309,16 +278,11 @@ class CycleAPIEndpoint(BaseAPIView):
request_data = request.data
if (
cycle.end_date is not None
and cycle.end_date < timezone.now().date()
):
if cycle.end_date is not None and cycle.end_date < timezone.now():
if "sort_order" in request_data:
# Can only change sort order
request_data = {
"sort_order": request_data.get(
"sort_order", cycle.sort_order
)
"sort_order": request_data.get("sort_order", cycle.sort_order)
}
else:
return Response(
@@ -365,9 +329,7 @@ class CycleAPIEndpoint(BaseAPIView):
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def delete(self, request, slug, project_id, pk):
cycle = Cycle.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
cycle = Cycle.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
if cycle.owned_by_id != request.user.id and (
not ProjectMember.objects.filter(
workspace__slug=slug,
@@ -383,9 +345,9 @@ class CycleAPIEndpoint(BaseAPIView):
)
cycle_issues = list(
CycleIssue.objects.filter(
cycle_id=self.kwargs.get("pk")
).values_list("issue", flat=True)
CycleIssue.objects.filter(cycle_id=self.kwargs.get("pk")).values_list(
"issue", flat=True
)
)
issue_activity.delay(
@@ -405,23 +367,15 @@ class CycleAPIEndpoint(BaseAPIView):
)
# Delete the cycle
cycle.delete()
# Delete the cycle issues
CycleIssue.objects.filter(
cycle_id=self.kwargs.get("pk"),
).delete()
# Delete the user favorite cycle
UserFavorite.objects.filter(
entity_type="cycle",
entity_identifier=pk,
project_id=project_id,
entity_type="cycle", entity_identifier=pk, project_id=project_id
).delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class CycleArchiveUnarchiveAPIEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
permission_classes = [ProjectEntityPermission]
def get_queryset(self):
return (
@@ -441,6 +395,7 @@ class CycleArchiveUnarchiveAPIEndpoint(BaseAPIView):
filter=Q(
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -451,6 +406,7 @@ class CycleArchiveUnarchiveAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="completed",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -461,6 +417,7 @@ class CycleArchiveUnarchiveAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="cancelled",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -471,6 +428,7 @@ class CycleArchiveUnarchiveAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="started",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -481,6 +439,7 @@ class CycleArchiveUnarchiveAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="unstarted",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -491,12 +450,11 @@ class CycleArchiveUnarchiveAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="backlog",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
.annotate(
total_estimates=Sum("issue_cycle__issue__estimate_point")
)
.annotate(total_estimates=Sum("issue_cycle__issue__estimate_point"))
.annotate(
completed_estimates=Sum(
"issue_cycle__issue__estimate_point",
@@ -504,6 +462,7 @@ class CycleArchiveUnarchiveAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="completed",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -514,6 +473,7 @@ class CycleArchiveUnarchiveAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="started",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -526,10 +486,7 @@ class CycleArchiveUnarchiveAPIEndpoint(BaseAPIView):
request=request,
queryset=(self.get_queryset()),
on_results=lambda cycles: CycleSerializer(
cycles,
many=True,
fields=self.fields,
expand=self.expand,
cycles, many=True, fields=self.fields, expand=self.expand
).data,
)
@@ -537,7 +494,7 @@ class CycleArchiveUnarchiveAPIEndpoint(BaseAPIView):
cycle = Cycle.objects.get(
pk=cycle_id, project_id=project_id, workspace__slug=slug
)
if cycle.end_date >= timezone.now().date():
if cycle.end_date >= timezone.now():
return Response(
{"error": "Only completed cycles can be archived"},
status=status.HTTP_400_BAD_REQUEST,
@@ -572,16 +529,12 @@ class CycleIssueAPIEndpoint(BaseAPIView):
model = CycleIssue
webhook_event = "cycle_issue"
bulk = True
permission_classes = [
ProjectEntityPermission,
]
permission_classes = [ProjectEntityPermission]
def get_queryset(self):
return (
CycleIssue.objects.annotate(
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("issue_id")
)
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("issue_id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -619,11 +572,11 @@ class CycleIssueAPIEndpoint(BaseAPIView):
# List
order_by = request.GET.get("order_by", "created_at")
issues = (
Issue.issue_objects.filter(issue_cycle__cycle_id=cycle_id)
Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id, issue_cycle__deleted_at__isnull=True
)
.annotate(
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("id")
)
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -645,8 +598,9 @@ class CycleIssueAPIEndpoint(BaseAPIView):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -658,10 +612,7 @@ class CycleIssueAPIEndpoint(BaseAPIView):
request=request,
queryset=(issues),
on_results=lambda issues: CycleSerializer(
issues,
many=True,
fields=self.fields,
expand=self.expand,
issues, many=True, fields=self.fields, expand=self.expand
).data,
)
@@ -670,8 +621,7 @@ class CycleIssueAPIEndpoint(BaseAPIView):
if not issues:
return Response(
{"error": "Issues are required"},
status=status.HTTP_400_BAD_REQUEST,
{"error": "Issues are required"}, status=status.HTTP_400_BAD_REQUEST
)
cycle = Cycle.objects.get(
@@ -680,9 +630,7 @@ class CycleIssueAPIEndpoint(BaseAPIView):
# Get all CycleIssues already created
cycle_issues = list(
CycleIssue.objects.filter(
~Q(cycle_id=cycle_id), issue_id__in=issues
)
CycleIssue.objects.filter(~Q(cycle_id=cycle_id), issue_id__in=issues)
)
existing_issues = [
@@ -727,9 +675,7 @@ class CycleIssueAPIEndpoint(BaseAPIView):
)
# Update the cycle issues
CycleIssue.objects.bulk_update(
updated_records, ["cycle_id"], batch_size=100
)
CycleIssue.objects.bulk_update(updated_records, ["cycle_id"], batch_size=100)
# Capture Issue Activity
issue_activity.delay(
@@ -788,9 +734,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
"""
permission_classes = [
ProjectEntityPermission,
]
permission_classes = [ProjectEntityPermission]
def post(self, request, slug, project_id, cycle_id):
new_cycle_id = request.data.get("new_cycle_id", False)
@@ -815,6 +759,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
filter=Q(
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -825,6 +770,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="completed",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -835,6 +781,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="cancelled",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -845,6 +792,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="started",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -855,6 +803,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="unstarted",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -865,6 +814,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
issue_cycle__issue__state__group="backlog",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
issue_cycle__deleted_at__isnull=True,
),
)
)
@@ -881,18 +831,37 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
assignee_estimate_data = (
Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
)
.annotate(display_name=F("assignees__display_name"))
.annotate(assignee_id=F("assignees__id"))
.annotate(avatar=F("assignees__avatar"))
.values("display_name", "assignee_id", "avatar")
.annotate(
total_estimates=Sum(
Cast("estimate_point__value", FloatField())
avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
assignees__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"assignees__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
assignees__avatar_asset__isnull=True,
then="assignees__avatar",
),
default=Value(None),
output_field=models.CharField(),
)
)
.values("display_name", "assignee_id", "avatar", "avatar_url")
.annotate(
total_estimates=Sum(Cast("estimate_point__value", FloatField()))
)
.annotate(
completed_estimates=Sum(
Cast("estimate_point__value", FloatField()),
@@ -920,11 +889,10 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
{
"display_name": item["display_name"],
"assignee_id": (
str(item["assignee_id"])
if item["assignee_id"]
else None
str(item["assignee_id"]) if item["assignee_id"] else None
),
"avatar": item["avatar"],
"avatar": item.get("avatar", None),
"avatar_url": item.get("avatar_url", None),
"total_estimates": item["total_estimates"],
"completed_estimates": item["completed_estimates"],
"pending_estimates": item["pending_estimates"],
@@ -935,6 +903,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
label_distribution_data = (
Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
)
@@ -943,9 +912,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
.annotate(label_id=F("labels__id"))
.values("label_name", "color", "label_id")
.annotate(
total_estimates=Sum(
Cast("estimate_point__value", FloatField())
)
total_estimates=Sum(Cast("estimate_point__value", FloatField()))
)
.annotate(
completed_estimates=Sum(
@@ -982,9 +949,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
{
"label_name": item["label_name"],
"color": item["color"],
"label_id": (
str(item["label_id"]) if item["label_id"] else None
),
"label_id": (str(item["label_id"]) if item["label_id"] else None),
"total_estimates": item["total_estimates"],
"completed_estimates": item["completed_estimates"],
"pending_estimates": item["pending_estimates"],
@@ -996,21 +961,37 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
assignee_distribution = (
Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
)
.annotate(display_name=F("assignees__display_name"))
.annotate(assignee_id=F("assignees__id"))
.annotate(avatar=F("assignees__avatar"))
.values("display_name", "assignee_id", "avatar")
.annotate(
avatar_url=Case(
# If `avatar_asset` exists, use it to generate the asset URL
When(
assignees__avatar_asset__isnull=False,
then=Concat(
Value("/api/assets/v2/static/"),
"assignees__avatar_asset", # Assuming avatar_asset has an id or relevant field
Value("/"),
),
),
# If `avatar_asset` is None, fall back to using `avatar` field directly
When(
assignees__avatar_asset__isnull=True, then="assignees__avatar"
),
default=Value(None),
output_field=models.CharField(),
)
)
.values("display_name", "assignee_id", "avatar_url")
.annotate(
total_issues=Count(
"id",
filter=Q(
archived_at__isnull=True,
is_draft=False,
),
),
"id", filter=Q(archived_at__isnull=True, is_draft=False)
)
)
.annotate(
completed_issues=Count(
@@ -1041,7 +1022,8 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
"assignee_id": (
str(item["assignee_id"]) if item["assignee_id"] else None
),
"avatar": item["avatar"],
"avatar": item.get("avatar", None),
"avatar_url": item.get("avatar_url", None),
"total_issues": item["total_issues"],
"completed_issues": item["completed_issues"],
"pending_issues": item["pending_issues"],
@@ -1053,6 +1035,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
label_distribution = (
Issue.issue_objects.filter(
issue_cycle__cycle_id=cycle_id,
issue_cycle__deleted_at__isnull=True,
workspace__slug=slug,
project_id=project_id,
)
@@ -1062,12 +1045,8 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
.values("label_name", "color", "label_id")
.annotate(
total_issues=Count(
"id",
filter=Q(
archived_at__isnull=True,
is_draft=False,
),
),
"id", filter=Q(archived_at__isnull=True, is_draft=False)
)
)
.annotate(
completed_issues=Count(
@@ -1097,9 +1076,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
{
"label_name": item["label_name"],
"color": item["color"],
"label_id": (
str(item["label_id"]) if item["label_id"] else None
),
"label_id": (str(item["label_id"]) if item["label_id"] else None),
"total_issues": item["total_issues"],
"completed_issues": item["completed_issues"],
"pending_issues": item["pending_issues"],
@@ -1144,10 +1121,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
}
current_cycle.save(update_fields=["progress_snapshot"])
if (
new_cycle.end_date is not None
and new_cycle.end_date < timezone.now().date()
):
if new_cycle.end_date is not None and new_cycle.end_date < timezone.now():
return Response(
{
"error": "The cycle where the issues are transferred is already completed"

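The assignee distribution queries in this endpoint now compute an avatar_url per assignee: when the user has an uploaded avatar_asset, the URL is assembled from the static asset prefix and the asset id; otherwise the legacy avatar string is passed through. Stripped down to just that expression (relation names as in the diff, the surrounding .values()/aggregate chain omitted):

    from django.db import models
    from django.db.models import Case, F, Value, When
    from django.db.models.functions import Concat

    avatar_url_expr = Case(
        When(
            assignees__avatar_asset__isnull=False,
            then=Concat(
                Value("/api/assets/v2/static/"),
                "assignees__avatar_asset",  # asset id becomes the path segment
                Value("/"),
            ),
        ),
        When(assignees__avatar_asset__isnull=True, then=F("assignees__avatar")),
        default=Value(None),
        output_field=models.CharField(),
    )

The response payload then reads both keys defensively with item.get("avatar", None) and item.get("avatar_url", None), so rows built without the new annotation still serialize.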
View File

@@ -14,60 +14,47 @@ from rest_framework import status
from rest_framework.response import Response
# Module imports
from plane.api.serializers import InboxIssueSerializer, IssueSerializer
from plane.api.serializers import IntakeIssueSerializer, IssueSerializer
from plane.app.permissions import ProjectLitePermission
from plane.bgtasks.issue_activities_task import issue_activity
from plane.db.models import (
Inbox,
InboxIssue,
Issue,
Project,
ProjectMember,
State,
)
from plane.db.models import Intake, IntakeIssue, Issue, Project, ProjectMember, State
from .base import BaseAPIView
class InboxIssueAPIEndpoint(BaseAPIView):
class IntakeIssueAPIEndpoint(BaseAPIView):
"""
This viewset automatically provides `list`, `create`, `retrieve`,
`update` and `destroy` actions related to inbox issues.
`update` and `destroy` actions related to intake issues.
"""
permission_classes = [
ProjectLitePermission,
]
permission_classes = [ProjectLitePermission]
serializer_class = InboxIssueSerializer
model = InboxIssue
serializer_class = IntakeIssueSerializer
model = IntakeIssue
filterset_fields = [
"status",
]
filterset_fields = ["status"]
def get_queryset(self):
inbox = Inbox.objects.filter(
intake = Intake.objects.filter(
workspace__slug=self.kwargs.get("slug"),
project_id=self.kwargs.get("project_id"),
).first()
project = Project.objects.get(
workspace__slug=self.kwargs.get("slug"),
pk=self.kwargs.get("project_id"),
workspace__slug=self.kwargs.get("slug"), pk=self.kwargs.get("project_id")
)
if inbox is None and not project.inbox_view:
return InboxIssue.objects.none()
if intake is None and not project.intake_view:
return IntakeIssue.objects.none()
return (
InboxIssue.objects.filter(
Q(snoozed_till__gte=timezone.now())
| Q(snoozed_till__isnull=True),
IntakeIssue.objects.filter(
Q(snoozed_till__gte=timezone.now()) | Q(snoozed_till__isnull=True),
workspace__slug=self.kwargs.get("slug"),
project_id=self.kwargs.get("project_id"),
inbox_id=inbox.id,
intake_id=intake.id,
)
.select_related("issue", "workspace", "project")
.order_by(self.kwargs.get("order_by", "-created_at"))
@@ -75,49 +62,37 @@ class InboxIssueAPIEndpoint(BaseAPIView):
def get(self, request, slug, project_id, issue_id=None):
if issue_id:
inbox_issue_queryset = self.get_queryset().get(issue_id=issue_id)
inbox_issue_data = InboxIssueSerializer(
inbox_issue_queryset,
fields=self.fields,
expand=self.expand,
intake_issue_queryset = self.get_queryset().get(issue_id=issue_id)
intake_issue_data = IntakeIssueSerializer(
intake_issue_queryset, fields=self.fields, expand=self.expand
).data
return Response(
inbox_issue_data,
status=status.HTTP_200_OK,
)
return Response(intake_issue_data, status=status.HTTP_200_OK)
issue_queryset = self.get_queryset()
return self.paginate(
request=request,
queryset=(issue_queryset),
on_results=lambda inbox_issues: InboxIssueSerializer(
inbox_issues,
many=True,
fields=self.fields,
expand=self.expand,
on_results=lambda intake_issues: IntakeIssueSerializer(
intake_issues, many=True, fields=self.fields, expand=self.expand
).data,
)
def post(self, request, slug, project_id):
if not request.data.get("issue", {}).get("name", False):
return Response(
{"error": "Name is required"},
status=status.HTTP_400_BAD_REQUEST,
{"error": "Name is required"}, status=status.HTTP_400_BAD_REQUEST
)
inbox = Inbox.objects.filter(
intake = Intake.objects.filter(
workspace__slug=slug, project_id=project_id
).first()
project = Project.objects.get(
workspace__slug=slug,
pk=project_id,
)
project = Project.objects.get(workspace__slug=slug, pk=project_id)
# Inbox view
if inbox is None and not project.inbox_view:
# Intake view
if intake is None and not project.intake_view:
return Response(
{
"error": "Inbox is not enabled for this project enable it through the project's api"
"error": "Intake is not enabled for this project enable it through the project's api"
},
status=status.HTTP_400_BAD_REQUEST,
)
@@ -131,15 +106,14 @@ class InboxIssueAPIEndpoint(BaseAPIView):
"none",
]:
return Response(
{"error": "Invalid priority"},
status=status.HTTP_400_BAD_REQUEST,
{"error": "Invalid priority"}, status=status.HTTP_400_BAD_REQUEST
)
# Create or get state
state, _ = State.objects.get_or_create(
name="Triage",
group="triage",
description="Default state for managing all Inbox Issues",
description="Default state for managing all Intake Issues",
project_id=project_id,
color="#ff7700",
is_triage=True,
@@ -157,12 +131,12 @@ class InboxIssueAPIEndpoint(BaseAPIView):
state=state,
)
# create an inbox issue
inbox_issue = InboxIssue.objects.create(
inbox_id=inbox.id,
# create an intake issue
intake_issue = IntakeIssue.objects.create(
intake_id=intake.id,
project_id=project_id,
issue=issue,
source=request.data.get("source", "in-app"),
source=request.data.get("source", "IN-APP"),
)
# Create an Issue Activity
issue_activity.delay(
@@ -173,32 +147,34 @@ class InboxIssueAPIEndpoint(BaseAPIView):
project_id=str(project_id),
current_instance=None,
epoch=int(timezone.now().timestamp()),
inbox=str(inbox_issue.id),
intake=str(intake_issue.id),
)
serializer = InboxIssueSerializer(inbox_issue)
serializer = IntakeIssueSerializer(intake_issue)
return Response(serializer.data, status=status.HTTP_200_OK)
def patch(self, request, slug, project_id, issue_id):
inbox = Inbox.objects.filter(
intake = Intake.objects.filter(
workspace__slug=slug, project_id=project_id
).first()
# Inbox view
if inbox is None:
project = Project.objects.get(workspace__slug=slug, pk=project_id)
# Intake view
if intake is None and not project.intake_view:
return Response(
{
"error": "Inbox is not enabled for this project enable it through the project's api"
"error": "Intake is not enabled for this project enable it through the project's api"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Get the inbox issue
inbox_issue = InboxIssue.objects.get(
# Get the intake issue
intake_issue = IntakeIssue.objects.get(
issue_id=issue_id,
workspace__slug=slug,
project_id=project_id,
inbox_id=inbox.id,
intake_id=intake.id,
)
# Get the project member
@@ -210,11 +186,11 @@ class InboxIssueAPIEndpoint(BaseAPIView):
)
# Only project members admins and created_by users can access this endpoint
if project_member.role <= 5 and str(inbox_issue.created_by_id) != str(
if project_member.role <= 5 and str(intake_issue.created_by_id) != str(
request.user.id
):
return Response(
{"error": "You cannot edit inbox issues"},
{"error": "You cannot edit intake issues"},
status=status.HTTP_400_BAD_REQUEST,
)
@@ -227,7 +203,10 @@ class InboxIssueAPIEndpoint(BaseAPIView):
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
filter=Q(
~Q(labels__id__isnull=True)
& Q(label_issue__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
@@ -235,15 +214,15 @@ class InboxIssueAPIEndpoint(BaseAPIView):
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True),
filter=Q(
~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True)
& Q(issue_assignee__deleted_at__isnull=True)
),
),
Value([], output_field=ArrayField(UUIDField())),
),
).get(
pk=issue_id,
workspace__slug=slug,
project_id=project_id,
)
).get(pk=issue_id, workspace__slug=slug, project_id=project_id)
# Only allow guests to edit name and description
if project_member.role <= 5:
issue_data = {
@@ -251,14 +230,10 @@ class InboxIssueAPIEndpoint(BaseAPIView):
"description_html": issue_data.get(
"description_html", issue.description_html
),
"description": issue_data.get(
"description", issue.description
),
"description": issue_data.get("description", issue.description),
}
issue_serializer = IssueSerializer(
issue, data=issue_data, partial=True
)
issue_serializer = IssueSerializer(issue, data=issue_data, partial=True)
if issue_serializer.is_valid():
current_instance = issue
@@ -276,7 +251,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
cls=DjangoJSONEncoder,
),
epoch=int(timezone.now().timestamp()),
inbox=(inbox_issue.id),
intake=(intake_issue.id),
)
issue_serializer.save()
else:
@@ -284,13 +259,13 @@ class InboxIssueAPIEndpoint(BaseAPIView):
issue_serializer.errors, status=status.HTTP_400_BAD_REQUEST
)
# Only project admins and members can edit inbox issue attributes
if project_member.role > 5:
serializer = InboxIssueSerializer(
inbox_issue, data=request.data, partial=True
# Only project admins and members can edit intake issue attributes
if project_member.role > 15:
serializer = IntakeIssueSerializer(
intake_issue, data=request.data, partial=True
)
current_instance = json.dumps(
InboxIssueSerializer(inbox_issue).data, cls=DjangoJSONEncoder
IntakeIssueSerializer(intake_issue).data, cls=DjangoJSONEncoder
)
if serializer.is_valid():
@@ -298,14 +273,10 @@ class InboxIssueAPIEndpoint(BaseAPIView):
# Update the issue state if the issue is rejected or marked as duplicate
if serializer.data["status"] in [-1, 2]:
issue = Issue.objects.get(
pk=issue_id,
workspace__slug=slug,
project_id=project_id,
pk=issue_id, workspace__slug=slug, project_id=project_id
)
state = State.objects.filter(
group="cancelled",
workspace__slug=slug,
project_id=project_id,
group="cancelled", workspace__slug=slug, project_id=project_id
).first()
if state is not None:
issue.state = state
@@ -314,18 +285,14 @@ class InboxIssueAPIEndpoint(BaseAPIView):
# Update the issue state if it is accepted
if serializer.data["status"] in [1]:
issue = Issue.objects.get(
pk=issue_id,
workspace__slug=slug,
project_id=project_id,
pk=issue_id, workspace__slug=slug, project_id=project_id
)
# Update the issue state only if it is in triage state
if issue.state.is_triage:
# Move to default state
state = State.objects.filter(
workspace__slug=slug,
project_id=project_id,
default=True,
workspace__slug=slug, project_id=project_id, default=True
).first()
if state is not None:
issue.state = state
@@ -333,10 +300,8 @@ class InboxIssueAPIEndpoint(BaseAPIView):
# create a activity for status change
issue_activity.delay(
type="inbox.activity.created",
requested_data=json.dumps(
request.data, cls=DjangoJSONEncoder
),
type="intake.activity.created",
requested_data=json.dumps(request.data, cls=DjangoJSONEncoder),
actor_id=str(request.user.id),
issue_id=str(issue_id),
project_id=str(project_id),
@@ -344,48 +309,42 @@ class InboxIssueAPIEndpoint(BaseAPIView):
epoch=int(timezone.now().timestamp()),
notification=False,
origin=request.META.get("HTTP_ORIGIN"),
inbox=str(inbox_issue.id),
intake=str(intake_issue.id),
)
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(
serializer.errors, status=status.HTTP_400_BAD_REQUEST
)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
else:
return Response(
InboxIssueSerializer(inbox_issue).data,
status=status.HTTP_200_OK,
IntakeIssueSerializer(intake_issue).data, status=status.HTTP_200_OK
)
def delete(self, request, slug, project_id, issue_id):
inbox = Inbox.objects.filter(
intake = Intake.objects.filter(
workspace__slug=slug, project_id=project_id
).first()
project = Project.objects.get(
workspace__slug=slug,
pk=project_id,
)
project = Project.objects.get(workspace__slug=slug, pk=project_id)
# Inbox view
if inbox is None and not project.inbox_view:
# Intake view
if intake is None and not project.intake_view:
return Response(
{
"error": "Inbox is not enabled for this project enable it through the project's api"
"error": "Intake is not enabled for this project enable it through the project's api"
},
status=status.HTTP_400_BAD_REQUEST,
)
# Get the inbox issue
inbox_issue = InboxIssue.objects.get(
# Get the intake issue
intake_issue = IntakeIssue.objects.get(
issue_id=issue_id,
workspace__slug=slug,
project_id=project_id,
inbox_id=inbox.id,
intake_id=intake.id,
)
# Check the issue status
if inbox_issue.status in [-2, -1, 0, 2]:
if intake_issue.status in [-2, -1, 0, 2]:
# Delete the issue also
issue = Issue.objects.filter(
workspace__slug=slug, project_id=project_id, pk=issue_id
@@ -405,5 +364,5 @@ class InboxIssueAPIEndpoint(BaseAPIView):
)
issue.delete()
inbox_issue.delete()
intake_issue.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
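Taken together, the patch branch above is the intake state machine: a rejected or duplicate intake issue (status of -1 or 2) moves the underlying issue to a state in the "cancelled" group, while an accepted one (status 1) that is still sitting in triage moves to the project's default state. The role threshold guarding attribute edits also rises from role > 5 to role > 15. A condensed sketch of the state transition alone (status codes and lookups from the diff; error handling and activity logging omitted):

    from plane.db.models import State

    def apply_intake_status(issue, status_code, slug, project_id):
        state = None
        if status_code in (-1, 2):  # rejected or marked duplicate
            state = State.objects.filter(
                group="cancelled", workspace__slug=slug, project_id=project_id
            ).first()
        elif status_code == 1 and issue.state.is_triage:  # accepted while in triage
            state = State.objects.filter(
                workspace__slug=slug, project_id=project_id, default=True
            ).first()
        if state is not None:
            issue.state = state
            issue.save()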

View File

@@ -16,6 +16,7 @@ from django.db.models import (
Q,
Value,
When,
Subquery,
)
from django.utils import timezone
@@ -42,12 +43,13 @@ from plane.bgtasks.issue_activities_task import issue_activity
from plane.db.models import (
Issue,
IssueActivity,
IssueAttachment,
FileAsset,
IssueComment,
IssueLink,
Label,
Project,
ProjectMember,
CycleIssue,
)
from .base import BaseAPIView
@@ -71,9 +73,7 @@ class WorkspaceIssueAPIEndpoint(BaseAPIView):
def get_queryset(self):
return (
Issue.issue_objects.annotate(
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("id")
)
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -89,14 +89,10 @@ class WorkspaceIssueAPIEndpoint(BaseAPIView):
.order_by(self.kwargs.get("order_by", "-created_at"))
).distinct()
def get(
self, request, slug, project__identifier=None, issue__identifier=None
):
def get(self, request, slug, project__identifier=None, issue__identifier=None):
if issue__identifier and project__identifier:
issue = Issue.issue_objects.annotate(
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("id")
)
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -106,11 +102,7 @@ class WorkspaceIssueAPIEndpoint(BaseAPIView):
sequence_id=issue__identifier,
)
return Response(
IssueSerializer(
issue,
fields=self.fields,
expand=self.expand,
).data,
IssueSerializer(issue, fields=self.fields, expand=self.expand).data,
status=status.HTTP_200_OK,
)
@@ -124,17 +116,13 @@ class IssueAPIEndpoint(BaseAPIView):
model = Issue
webhook_event = "issue"
permission_classes = [
ProjectEntityPermission,
]
permission_classes = [ProjectEntityPermission]
serializer_class = IssueSerializer
def get_queryset(self):
return (
Issue.issue_objects.annotate(
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("id")
)
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -162,47 +150,37 @@ class IssueAPIEndpoint(BaseAPIView):
project_id=project_id,
)
return Response(
IssueSerializer(
issue,
fields=self.fields,
expand=self.expand,
).data,
IssueSerializer(issue, fields=self.fields, expand=self.expand).data,
status=status.HTTP_200_OK,
)
if pk:
issue = Issue.issue_objects.annotate(
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("id")
)
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
).get(workspace__slug=slug, project_id=project_id, pk=pk)
return Response(
IssueSerializer(
issue,
fields=self.fields,
expand=self.expand,
).data,
IssueSerializer(issue, fields=self.fields, expand=self.expand).data,
status=status.HTTP_200_OK,
)
# Custom ordering for priority and state
priority_order = ["urgent", "high", "medium", "low", "none"]
state_order = [
"backlog",
"unstarted",
"started",
"completed",
"cancelled",
]
state_order = ["backlog", "unstarted", "started", "completed", "cancelled"]
order_by_param = request.GET.get("order_by", "-created_at")
issue_queryset = (
self.get_queryset()
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(
cycle_id=Subquery(
CycleIssue.objects.filter(
issue=OuterRef("id"), deleted_at__isnull=True
).values("cycle_id")[:1]
)
)
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@@ -210,8 +188,9 @@ class IssueAPIEndpoint(BaseAPIView):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -222,9 +201,7 @@ class IssueAPIEndpoint(BaseAPIView):
# Priority Ordering
if order_by_param == "priority" or order_by_param == "-priority":
priority_order = (
priority_order
if order_by_param == "priority"
else priority_order[::-1]
priority_order if order_by_param == "priority" else priority_order[::-1]
)
issue_queryset = issue_queryset.annotate(
priority_order=Case(
@@ -272,9 +249,7 @@ class IssueAPIEndpoint(BaseAPIView):
else order_by_param
)
).order_by(
"-max_values"
if order_by_param.startswith("-")
else "max_values"
"-max_values" if order_by_param.startswith("-") else "max_values"
)
else:
issue_queryset = issue_queryset.order_by(order_by_param)
@@ -283,10 +258,7 @@ class IssueAPIEndpoint(BaseAPIView):
request=request,
queryset=(issue_queryset),
on_results=lambda issues: IssueSerializer(
issues,
many=True,
fields=self.fields,
expand=self.expand,
issues, many=True, fields=self.fields, expand=self.expand
).data,
)
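The custom priority ordering above works by mapping each priority label to its position in a list with a Case expression and sorting on that numeric annotation; requesting -priority simply reverses the list first. A minimal reconstruction of the same trick, since the hunk is cut before the Case body (issue_queryset and order_by_param are as in the view; state_order follows the identical pattern):

    from django.db.models import Case, IntegerField, Value, When

    priority_order = ["urgent", "high", "medium", "low", "none"]
    if order_by_param == "-priority":
        priority_order = priority_order[::-1]

    issue_queryset = issue_queryset.annotate(
        priority_order=Case(
            *[When(priority=p, then=Value(i)) for i, p in enumerate(priority_order)],
            output_field=IntegerField(),
        )
    ).order_by("priority_order")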
@@ -330,22 +302,16 @@ class IssueAPIEndpoint(BaseAPIView):
serializer.save()
# Refetch the issue
issue = Issue.objects.filter(
workspace__slug=slug,
project_id=project_id,
pk=serializer.data["id"],
workspace__slug=slug, project_id=project_id, pk=serializer.data["id"]
).first()
issue.created_at = request.data.get("created_at", timezone.now())
issue.created_by_id = request.data.get(
"created_by", request.user.id
)
issue.created_by_id = request.data.get("created_by", request.user.id)
issue.save(update_fields=["created_at", "created_by"])
# Track the issue
issue_activity.delay(
type="issue.activity.created",
requested_data=json.dumps(
self.request.data, cls=DjangoJSONEncoder
),
requested_data=json.dumps(self.request.data, cls=DjangoJSONEncoder),
actor_id=str(request.user.id),
issue_id=str(serializer.data.get("id", None)),
project_id=str(project_id),
@@ -382,9 +348,7 @@ class IssueAPIEndpoint(BaseAPIView):
# Get the requested data, encode it as django object and pass it
# to serializer to validation
requested_data = json.dumps(
self.request.data, cls=DjangoJSONEncoder
)
requested_data = json.dumps(self.request.data, cls=DjangoJSONEncoder)
serializer = IssueSerializer(
issue,
data=request.data,
@@ -442,9 +406,7 @@ class IssueAPIEndpoint(BaseAPIView):
# If any of the created_at or created_by is present, update
# the issue with the provided data, else return with the
# default states given.
issue.created_at = request.data.get(
"created_at", timezone.now()
)
issue.created_at = request.data.get("created_at", timezone.now())
issue.created_by_id = request.data.get(
"created_by", request.user.id
)
@@ -461,12 +423,8 @@ class IssueAPIEndpoint(BaseAPIView):
current_instance=None,
epoch=int(timezone.now().timestamp()),
)
return Response(
serializer.data, status=status.HTTP_201_CREATED
)
return Response(
serializer.errors, status=status.HTTP_400_BAD_REQUEST
)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
else:
return Response(
{"error": "external_id and external_source are required"},
@@ -474,9 +432,7 @@ class IssueAPIEndpoint(BaseAPIView):
)
def patch(self, request, slug, project_id, pk=None):
issue = Issue.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
issue = Issue.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
project = Project.objects.get(pk=project_id)
current_instance = json.dumps(
IssueSerializer(issue).data, cls=DjangoJSONEncoder
@@ -485,10 +441,7 @@ class IssueAPIEndpoint(BaseAPIView):
serializer = IssueSerializer(
issue,
data=request.data,
context={
"project_id": project_id,
"workspace_id": project.workspace_id,
},
context={"project_id": project_id, "workspace_id": project.workspace_id},
partial=True,
)
if serializer.is_valid():
@@ -526,9 +479,7 @@ class IssueAPIEndpoint(BaseAPIView):
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def delete(self, request, slug, project_id, pk=None):
issue = Issue.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
issue = Issue.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
if issue.created_by_id != request.user.id and (
not ProjectMember.objects.filter(
workspace__slug=slug,
@@ -567,9 +518,7 @@ class LabelAPIEndpoint(BaseAPIView):
serializer_class = LabelSerializer
model = Label
permission_classes = [
ProjectMemberPermission,
]
permission_classes = [ProjectMemberPermission]
def get_queryset(self):
return (
@@ -616,12 +565,8 @@ class LabelAPIEndpoint(BaseAPIView):
)
serializer.save(project_id=project_id)
return Response(
serializer.data, status=status.HTTP_201_CREATED
)
return Response(
serializer.errors, status=status.HTTP_400_BAD_REQUEST
)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
except IntegrityError:
label = Label.objects.filter(
workspace__slug=slug,
@@ -642,18 +587,11 @@ class LabelAPIEndpoint(BaseAPIView):
request=request,
queryset=(self.get_queryset()),
on_results=lambda labels: LabelSerializer(
labels,
many=True,
fields=self.fields,
expand=self.expand,
labels, many=True, fields=self.fields, expand=self.expand
).data,
)
label = self.get_queryset().get(pk=pk)
serializer = LabelSerializer(
label,
fields=self.fields,
expand=self.expand,
)
serializer = LabelSerializer(label, fields=self.fields, expand=self.expand)
return Response(serializer.data, status=status.HTTP_200_OK)
def patch(self, request, slug, project_id, pk=None):
@@ -696,9 +634,7 @@ class IssueLinkAPIEndpoint(BaseAPIView):
"""
permission_classes = [
ProjectEntityPermission,
]
permission_classes = [ProjectEntityPermission]
model = IssueLink
serializer_class = IssueLinkSerializer
@@ -721,46 +657,32 @@ class IssueLinkAPIEndpoint(BaseAPIView):
if pk is None:
issue_links = self.get_queryset()
serializer = IssueLinkSerializer(
issue_links,
fields=self.fields,
expand=self.expand,
issue_links, fields=self.fields, expand=self.expand
)
return self.paginate(
request=request,
queryset=(self.get_queryset()),
on_results=lambda issue_links: IssueLinkSerializer(
issue_links,
many=True,
fields=self.fields,
expand=self.expand,
issue_links, many=True, fields=self.fields, expand=self.expand
).data,
)
issue_link = self.get_queryset().get(pk=pk)
serializer = IssueLinkSerializer(
issue_link,
fields=self.fields,
expand=self.expand,
issue_link, fields=self.fields, expand=self.expand
)
return Response(serializer.data, status=status.HTTP_200_OK)
def post(self, request, slug, project_id, issue_id):
serializer = IssueLinkSerializer(data=request.data)
if serializer.is_valid():
serializer.save(
project_id=project_id,
issue_id=issue_id,
)
serializer.save(project_id=project_id, issue_id=issue_id)
link = IssueLink.objects.get(pk=serializer.data["id"])
link.created_by_id = request.data.get(
"created_by", request.user.id
)
link.created_by_id = request.data.get("created_by", request.user.id)
link.save(update_fields=["created_by"])
issue_activity.delay(
type="link.activity.created",
requested_data=json.dumps(
serializer.data, cls=DjangoJSONEncoder
),
requested_data=json.dumps(serializer.data, cls=DjangoJSONEncoder),
issue_id=str(self.kwargs.get("issue_id")),
project_id=str(self.kwargs.get("project_id")),
actor_id=str(link.created_by_id),
@@ -772,19 +694,13 @@ class IssueLinkAPIEndpoint(BaseAPIView):
def patch(self, request, slug, project_id, issue_id, pk):
issue_link = IssueLink.objects.get(
workspace__slug=slug,
project_id=project_id,
issue_id=issue_id,
pk=pk,
workspace__slug=slug, project_id=project_id, issue_id=issue_id, pk=pk
)
requested_data = json.dumps(request.data, cls=DjangoJSONEncoder)
current_instance = json.dumps(
IssueLinkSerializer(issue_link).data,
cls=DjangoJSONEncoder,
)
serializer = IssueLinkSerializer(
issue_link, data=request.data, partial=True
IssueLinkSerializer(issue_link).data, cls=DjangoJSONEncoder
)
serializer = IssueLinkSerializer(issue_link, data=request.data, partial=True)
if serializer.is_valid():
serializer.save()
issue_activity.delay(
@@ -801,14 +717,10 @@ class IssueLinkAPIEndpoint(BaseAPIView):
def delete(self, request, slug, project_id, issue_id, pk):
issue_link = IssueLink.objects.get(
workspace__slug=slug,
project_id=project_id,
issue_id=issue_id,
pk=pk,
workspace__slug=slug, project_id=project_id, issue_id=issue_id, pk=pk
)
current_instance = json.dumps(
IssueLinkSerializer(issue_link).data,
cls=DjangoJSONEncoder,
IssueLinkSerializer(issue_link).data, cls=DjangoJSONEncoder
)
issue_activity.delay(
type="link.activity.deleted",
@@ -833,15 +745,11 @@ class IssueCommentAPIEndpoint(BaseAPIView):
serializer_class = IssueCommentSerializer
model = IssueComment
webhook_event = "issue_comment"
permission_classes = [
ProjectLitePermission,
]
permission_classes = [ProjectLitePermission]
def get_queryset(self):
return (
IssueComment.objects.filter(
workspace__slug=self.kwargs.get("slug")
)
IssueComment.objects.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project_id=self.kwargs.get("project_id"))
.filter(issue_id=self.kwargs.get("issue_id"))
.filter(
@@ -868,19 +776,14 @@ class IssueCommentAPIEndpoint(BaseAPIView):
if pk:
issue_comment = self.get_queryset().get(pk=pk)
serializer = IssueCommentSerializer(
issue_comment,
fields=self.fields,
expand=self.expand,
issue_comment, fields=self.fields, expand=self.expand
)
return Response(serializer.data, status=status.HTTP_200_OK)
return self.paginate(
request=request,
queryset=(self.get_queryset()),
on_results=lambda issue_comment: IssueCommentSerializer(
issue_comment,
many=True,
fields=self.fields,
expand=self.expand,
issue_comment, many=True, fields=self.fields, expand=self.expand
).data,
)
@@ -913,17 +816,11 @@ class IssueCommentAPIEndpoint(BaseAPIView):
serializer = IssueCommentSerializer(data=request.data)
if serializer.is_valid():
serializer.save(
project_id=project_id,
issue_id=issue_id,
actor=request.user,
)
issue_comment = IssueComment.objects.get(
pk=serializer.data.get("id")
project_id=project_id, issue_id=issue_id, actor=request.user
)
issue_comment = IssueComment.objects.get(pk=serializer.data.get("id"))
# Update the created_at and the created_by and save the comment
issue_comment.created_at = request.data.get(
"created_at", timezone.now()
)
issue_comment.created_at = request.data.get("created_at", timezone.now())
issue_comment.created_by_id = request.data.get(
"created_by", request.user.id
)
@@ -931,9 +828,7 @@ class IssueCommentAPIEndpoint(BaseAPIView):
issue_activity.delay(
type="comment.activity.created",
requested_data=json.dumps(
serializer.data, cls=DjangoJSONEncoder
),
requested_data=json.dumps(serializer.data, cls=DjangoJSONEncoder),
actor_id=str(issue_comment.created_by_id),
issue_id=str(self.kwargs.get("issue_id")),
project_id=str(self.kwargs.get("project_id")),
@@ -945,24 +840,17 @@ class IssueCommentAPIEndpoint(BaseAPIView):
def patch(self, request, slug, project_id, issue_id, pk):
issue_comment = IssueComment.objects.get(
workspace__slug=slug,
project_id=project_id,
issue_id=issue_id,
pk=pk,
workspace__slug=slug, project_id=project_id, issue_id=issue_id, pk=pk
)
requested_data = json.dumps(self.request.data, cls=DjangoJSONEncoder)
current_instance = json.dumps(
IssueCommentSerializer(issue_comment).data,
cls=DjangoJSONEncoder,
IssueCommentSerializer(issue_comment).data, cls=DjangoJSONEncoder
)
# Validation check if the issue already exists
if (
request.data.get("external_id")
and (
issue_comment.external_id
!= str(request.data.get("external_id"))
)
and (issue_comment.external_id != str(request.data.get("external_id")))
and IssueComment.objects.filter(
project_id=project_id,
workspace__slug=slug,
@@ -999,14 +887,10 @@ class IssueCommentAPIEndpoint(BaseAPIView):
def delete(self, request, slug, project_id, issue_id, pk):
issue_comment = IssueComment.objects.get(
workspace__slug=slug,
project_id=project_id,
issue_id=issue_id,
pk=pk,
workspace__slug=slug, project_id=project_id, issue_id=issue_id, pk=pk
)
current_instance = json.dumps(
IssueCommentSerializer(issue_comment).data,
cls=DjangoJSONEncoder,
IssueCommentSerializer(issue_comment).data, cls=DjangoJSONEncoder
)
issue_comment.delete()
issue_activity.delay(
@@ -1022,9 +906,7 @@ class IssueCommentAPIEndpoint(BaseAPIView):
class IssueActivityAPIEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
permission_classes = [ProjectEntityPermission]
def get(self, request, slug, project_id, issue_id, pk=None):
issue_activities = (
@@ -1049,20 +931,15 @@ class IssueActivityAPIEndpoint(BaseAPIView):
request=request,
queryset=(issue_activities),
on_results=lambda issue_activity: IssueActivitySerializer(
issue_activity,
many=True,
fields=self.fields,
expand=self.expand,
issue_activity, many=True, fields=self.fields, expand=self.expand
).data,
)
class IssueAttachmentEndpoint(BaseAPIView):
serializer_class = IssueAttachmentSerializer
permission_classes = [
ProjectEntityPermission,
]
model = IssueAttachment
permission_classes = [ProjectEntityPermission]
model = FileAsset
parser_classes = (MultiPartParser, FormParser)
def post(self, request, slug, project_id, issue_id):
@@ -1070,7 +947,7 @@ class IssueAttachmentEndpoint(BaseAPIView):
if (
request.data.get("external_id")
and request.data.get("external_source")
and IssueAttachment.objects.filter(
and FileAsset.objects.filter(
project_id=project_id,
workspace__slug=slug,
issue_id=issue_id,
@@ -1078,7 +955,7 @@ class IssueAttachmentEndpoint(BaseAPIView):
external_id=request.data.get("external_id"),
).exists()
):
issue_attachment = IssueAttachment.objects.filter(
issue_attachment = FileAsset.objects.filter(
workspace__slug=slug,
project_id=project_id,
external_id=request.data.get("external_id"),
@@ -1100,10 +977,7 @@ class IssueAttachmentEndpoint(BaseAPIView):
actor_id=str(self.request.user.id),
issue_id=str(self.kwargs.get("issue_id", None)),
project_id=str(self.kwargs.get("project_id", None)),
current_instance=json.dumps(
serializer.data,
cls=DjangoJSONEncoder,
),
current_instance=json.dumps(serializer.data, cls=DjangoJSONEncoder),
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
@@ -1112,7 +986,7 @@ class IssueAttachmentEndpoint(BaseAPIView):
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def delete(self, request, slug, project_id, issue_id, pk):
issue_attachment = IssueAttachment.objects.get(pk=pk)
issue_attachment = FileAsset.objects.get(pk=pk)
issue_attachment.asset.delete(save=False)
issue_attachment.delete()
issue_activity.delay(
@@ -1130,7 +1004,7 @@ class IssueAttachmentEndpoint(BaseAPIView):
return Response(status=status.HTTP_204_NO_CONTENT)
def get(self, request, slug, project_id, issue_id):
issue_attachments = IssueAttachment.objects.filter(
issue_attachments = FileAsset.objects.filter(
issue_id=issue_id, workspace__slug=slug, project_id=project_id
)
serializer = IssueAttachmentSerializer(issue_attachments, many=True)
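Two queryset changes in this file pair up: the cycle_id annotation becomes a Subquery over CycleIssue filtered on deleted_at__isnull=True, so an issue no longer reports a cycle through a soft-deleted link, and attachment counts now come from FileAsset rows scoped by entity_type instead of the removed IssueAttachment model. A reduced sketch of both annotations together (model and field names as in the diff):

    from django.db.models import F, Func, OuterRef, Subquery
    from plane.db.models import CycleIssue, FileAsset, Issue

    issues = Issue.issue_objects.annotate(
        # First non-deleted cycle link for the issue, if any
        cycle_id=Subquery(
            CycleIssue.objects.filter(
                issue=OuterRef("id"), deleted_at__isnull=True
            ).values("cycle_id")[:1]
        ),
        # Attachments are FileAsset rows tagged as issue attachments
        attachment_count=FileAsset.objects.filter(
            issue_id=OuterRef("id"),
            entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
        )
        .order_by()
        .annotate(count=Func(F("id"), function="Count"))
        .values("count"),
    )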

View File

@@ -13,24 +13,14 @@ from rest_framework import status
# Module imports
from .base import BaseAPIView
from plane.api.serializers import UserLiteSerializer
from plane.db.models import (
User,
Workspace,
Project,
WorkspaceMember,
ProjectMember,
)
from plane.db.models import User, Workspace, Project, WorkspaceMember, ProjectMember
from plane.app.permissions import (
ProjectMemberPermission,
)
from plane.app.permissions import ProjectMemberPermission
# API endpoint to get and insert users inside the workspace
class ProjectMemberAPIEndpoint(BaseAPIView):
permission_classes = [
ProjectMemberPermission,
]
permission_classes = [ProjectMemberPermission]
# Get all the users that are present inside the workspace
def get(self, request, slug, project_id):
@@ -48,10 +38,7 @@ class ProjectMemberAPIEndpoint(BaseAPIView):
# Get all the users that are present inside the workspace
users = UserLiteSerializer(
User.objects.filter(
id__in=project_members,
),
many=True,
User.objects.filter(id__in=project_members), many=True
).data
return Response(users, status=status.HTTP_200_OK)
@@ -78,8 +65,7 @@ class ProjectMemberAPIEndpoint(BaseAPIView):
validate_email(email)
except ValidationError:
return Response(
{"error": "Invalid email provided"},
status=status.HTTP_400_BAD_REQUEST,
{"error": "Invalid email provided"}, status=status.HTTP_400_BAD_REQUEST
)
workspace = Workspace.objects.filter(slug=slug).first()
@@ -108,9 +94,7 @@ class ProjectMemberAPIEndpoint(BaseAPIView):
).first()
if project_member:
return Response(
{
"error": "User is already part of the workspace and project"
},
{"error": "User is already part of the workspace and project"},
status=status.HTTP_400_BAD_REQUEST,
)
@@ -131,18 +115,14 @@ class ProjectMemberAPIEndpoint(BaseAPIView):
# Create a workspace member for the user if not already a member
if not workspace_member:
workspace_member = WorkspaceMember.objects.create(
workspace=workspace,
member=user,
role=request.data.get("role", 5),
workspace=workspace, member=user, role=request.data.get("role", 5)
)
workspace_member.save()
# Create a project member for the user if not already a member
if not project_member:
project_member = ProjectMember.objects.create(
project=project,
member=user,
role=request.data.get("role", 5),
project=project, member=user, role=request.data.get("role", 5)
)
project_member.save()
@@ -150,4 +130,3 @@ class ProjectMemberAPIEndpoint(BaseAPIView):
user_data = UserLiteSerializer(user).data
return Response(user_data, status=status.HTTP_201_CREATED)


@@ -21,7 +21,7 @@ from plane.app.permissions import ProjectEntityPermission
from plane.bgtasks.issue_activities_task import issue_activity
from plane.db.models import (
Issue,
IssueAttachment,
FileAsset,
IssueLink,
Module,
ModuleIssue,
@@ -43,9 +43,7 @@ class ModuleAPIEndpoint(BaseAPIView):
"""
model = Module
permission_classes = [
ProjectEntityPermission,
]
permission_classes = [ProjectEntityPermission]
serializer_class = ModuleSerializer
webhook_event = "module"
@@ -60,9 +58,7 @@ class ModuleAPIEndpoint(BaseAPIView):
.prefetch_related(
Prefetch(
"link_module",
queryset=ModuleLink.objects.select_related(
"module", "created_by"
),
queryset=ModuleLink.objects.select_related("module", "created_by"),
)
)
.annotate(
@@ -71,9 +67,10 @@ class ModuleAPIEndpoint(BaseAPIView):
filter=Q(
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
),
)
)
.annotate(
completed_issues=Count(
@@ -82,6 +79,7 @@ class ModuleAPIEndpoint(BaseAPIView):
issue_module__issue__state__group="completed",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
@@ -93,6 +91,7 @@ class ModuleAPIEndpoint(BaseAPIView):
issue_module__issue__state__group="cancelled",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
@@ -104,6 +103,7 @@ class ModuleAPIEndpoint(BaseAPIView):
issue_module__issue__state__group="started",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
@@ -115,6 +115,7 @@ class ModuleAPIEndpoint(BaseAPIView):
issue_module__issue__state__group="unstarted",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
@@ -126,6 +127,7 @@ class ModuleAPIEndpoint(BaseAPIView):
issue_module__issue__state__group="backlog",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
@@ -137,10 +139,7 @@ class ModuleAPIEndpoint(BaseAPIView):
project = Project.objects.get(pk=project_id, workspace__slug=slug)
serializer = ModuleSerializer(
data=request.data,
context={
"project_id": project_id,
"workspace_id": project.workspace_id,
},
context={"project_id": project_id, "workspace_id": project.workspace_id},
)
if serializer.is_valid():
if (
@@ -183,9 +182,7 @@ class ModuleAPIEndpoint(BaseAPIView):
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def patch(self, request, slug, project_id, pk):
module = Module.objects.get(
pk=pk, project_id=project_id, workspace__slug=slug
)
module = Module.objects.get(pk=pk, project_id=project_id, workspace__slug=slug)
current_instance = json.dumps(
ModuleSerializer(module).data, cls=DjangoJSONEncoder
@@ -197,10 +194,7 @@ class ModuleAPIEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST,
)
serializer = ModuleSerializer(
module,
data=request.data,
context={"project_id": project_id},
partial=True,
module, data=request.data, context={"project_id": project_id}, partial=True
)
if serializer.is_valid():
if (
@@ -240,33 +234,21 @@ class ModuleAPIEndpoint(BaseAPIView):
def get(self, request, slug, project_id, pk=None):
if pk:
queryset = (
self.get_queryset().filter(archived_at__isnull=True).get(pk=pk)
)
queryset = self.get_queryset().filter(archived_at__isnull=True).get(pk=pk)
data = ModuleSerializer(
queryset,
fields=self.fields,
expand=self.expand,
queryset, fields=self.fields, expand=self.expand
).data
return Response(
data,
status=status.HTTP_200_OK,
)
return Response(data, status=status.HTTP_200_OK)
return self.paginate(
request=request,
queryset=(self.get_queryset().filter(archived_at__isnull=True)),
on_results=lambda modules: ModuleSerializer(
modules,
many=True,
fields=self.fields,
expand=self.expand,
modules, many=True, fields=self.fields, expand=self.expand
).data,
)
def delete(self, request, slug, project_id, pk):
module = Module.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
module = Module.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
if module.created_by_id != request.user.id and (
not ProjectMember.objects.filter(
workspace__slug=slug,
@@ -282,9 +264,7 @@ class ModuleAPIEndpoint(BaseAPIView):
)
module_issues = list(
ModuleIssue.objects.filter(module_id=pk).values_list(
"issue", flat=True
)
ModuleIssue.objects.filter(module_id=pk).values_list("issue", flat=True)
)
issue_activity.delay(
type="module.activity.deleted",
@@ -298,24 +278,15 @@ class ModuleAPIEndpoint(BaseAPIView):
actor_id=str(request.user.id),
issue_id=None,
project_id=str(project_id),
current_instance=json.dumps(
{
"module_name": str(module.name),
}
),
current_instance=json.dumps({"module_name": str(module.name)}),
epoch=int(timezone.now().timestamp()),
)
module.delete()
# Delete the module issues
ModuleIssue.objects.filter(
module=pk,
project_id=project_id,
).delete()
ModuleIssue.objects.filter(module=pk, project_id=project_id).delete()
# Delete the user favorite module
UserFavorite.objects.filter(
entity_type="module",
entity_identifier=pk,
project_id=project_id,
entity_type="module", entity_identifier=pk, project_id=project_id
).delete()
return Response(status=status.HTTP_204_NO_CONTENT)
@@ -332,16 +303,12 @@ class ModuleIssueAPIEndpoint(BaseAPIView):
webhook_event = "module_issue"
bulk = True
permission_classes = [
ProjectEntityPermission,
]
permission_classes = [ProjectEntityPermission]
def get_queryset(self):
return (
ModuleIssue.objects.annotate(
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("issue")
)
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("issue"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -367,11 +334,11 @@ class ModuleIssueAPIEndpoint(BaseAPIView):
def get(self, request, slug, project_id, module_id):
order_by = request.GET.get("order_by", "created_at")
issues = (
Issue.issue_objects.filter(issue_module__module_id=module_id)
Issue.issue_objects.filter(
issue_module__module_id=module_id, issue_module__deleted_at__isnull=True
)
.annotate(
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("id")
)
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -393,8 +360,9 @@ class ModuleIssueAPIEndpoint(BaseAPIView):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
attachment_count=FileAsset.objects.filter(
issue_id=OuterRef("id"),
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -405,10 +373,7 @@ class ModuleIssueAPIEndpoint(BaseAPIView):
request=request,
queryset=(issues),
on_results=lambda issues: IssueSerializer(
issues,
many=True,
fields=self.fields,
expand=self.expand,
issues, many=True, fields=self.fields, expand=self.expand
).data,
)
@@ -416,8 +381,7 @@ class ModuleIssueAPIEndpoint(BaseAPIView):
issues = request.data.get("issues", [])
if not len(issues):
return Response(
{"error": "Issues are required"},
status=status.HTTP_400_BAD_REQUEST,
{"error": "Issues are required"}, status=status.HTTP_400_BAD_REQUEST
)
module = Module.objects.get(
workspace__slug=slug, project_id=project_id, pk=module_id
@@ -464,16 +428,10 @@ class ModuleIssueAPIEndpoint(BaseAPIView):
)
ModuleIssue.objects.bulk_create(
record_to_create,
batch_size=10,
ignore_conflicts=True,
record_to_create, batch_size=10, ignore_conflicts=True
)
ModuleIssue.objects.bulk_update(
records_to_update,
["module"],
batch_size=10,
)
ModuleIssue.objects.bulk_update(records_to_update, ["module"], batch_size=10)
# Capture Issue Activity
issue_activity.delay(
@@ -509,10 +467,7 @@ class ModuleIssueAPIEndpoint(BaseAPIView):
issue_activity.delay(
type="module.activity.deleted",
requested_data=json.dumps(
{
"module_id": str(module_id),
"issues": [str(module_issue.issue_id)],
}
{"module_id": str(module_id), "issues": [str(module_issue.issue_id)]}
),
actor_id=str(request.user.id),
issue_id=str(issue_id),
@@ -524,9 +479,7 @@ class ModuleIssueAPIEndpoint(BaseAPIView):
class ModuleArchiveUnarchiveAPIEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
permission_classes = [ProjectEntityPermission]
def get_queryset(self):
return (
@@ -540,9 +493,7 @@ class ModuleArchiveUnarchiveAPIEndpoint(BaseAPIView):
.prefetch_related(
Prefetch(
"link_module",
queryset=ModuleLink.objects.select_related(
"module", "created_by"
),
queryset=ModuleLink.objects.select_related("module", "created_by"),
)
)
.annotate(
@@ -551,9 +502,10 @@ class ModuleArchiveUnarchiveAPIEndpoint(BaseAPIView):
filter=Q(
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
),
)
)
.annotate(
completed_issues=Count(
@@ -562,6 +514,7 @@ class ModuleArchiveUnarchiveAPIEndpoint(BaseAPIView):
issue_module__issue__state__group="completed",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
@@ -573,6 +526,7 @@ class ModuleArchiveUnarchiveAPIEndpoint(BaseAPIView):
issue_module__issue__state__group="cancelled",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
@@ -584,6 +538,7 @@ class ModuleArchiveUnarchiveAPIEndpoint(BaseAPIView):
issue_module__issue__state__group="started",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
@@ -595,6 +550,7 @@ class ModuleArchiveUnarchiveAPIEndpoint(BaseAPIView):
issue_module__issue__state__group="unstarted",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
@@ -606,6 +562,7 @@ class ModuleArchiveUnarchiveAPIEndpoint(BaseAPIView):
issue_module__issue__state__group="backlog",
issue_module__issue__archived_at__isnull=True,
issue_module__issue__is_draft=False,
issue_module__deleted_at__isnull=True,
),
distinct=True,
)
@@ -618,22 +575,15 @@ class ModuleArchiveUnarchiveAPIEndpoint(BaseAPIView):
request=request,
queryset=(self.get_queryset()),
on_results=lambda modules: ModuleSerializer(
modules,
many=True,
fields=self.fields,
expand=self.expand,
modules, many=True, fields=self.fields, expand=self.expand
).data,
)
def post(self, request, slug, project_id, pk):
module = Module.objects.get(
pk=pk, project_id=project_id, workspace__slug=slug
)
module = Module.objects.get(pk=pk, project_id=project_id, workspace__slug=slug)
if module.status not in ["completed", "cancelled"]:
return Response(
{
"error": "Only completed or cancelled modules can be archived"
},
{"error": "Only completed or cancelled modules can be archived"},
status=status.HTTP_400_BAD_REQUEST,
)
module.archived_at = timezone.now()
@@ -647,9 +597,7 @@ class ModuleArchiveUnarchiveAPIEndpoint(BaseAPIView):
return Response(status=status.HTTP_204_NO_CONTENT)
def delete(self, request, slug, project_id, pk):
module = Module.objects.get(
pk=pk, project_id=project_id, workspace__slug=slug
)
module = Module.objects.get(pk=pk, project_id=project_id, workspace__slug=slug)
module.archived_at = None
module.save()
return Response(status=status.HTTP_204_NO_CONTENT)
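
The recurring change in this file is the extra issue_module__deleted_at__isnull=True condition on every per-state count, so soft-deleted module issues no longer inflate module progress. A condensed sketch of that annotation pattern, using only names visible above:

# Illustrative: archived, draft, and soft-deleted module-issue rows are all
# excluded from the aggregate counts.
from django.db.models import Count, Q

from plane.db.models import Module

def modules_with_issue_counts(project_id):
    return Module.objects.filter(project_id=project_id).annotate(
        total_issues=Count(
            "issue_module",
            filter=Q(
                issue_module__issue__archived_at__isnull=True,
                issue_module__issue__is_draft=False,
                issue_module__deleted_at__isnull=True,
            ),
            distinct=True,
        )
    )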


@@ -18,7 +18,7 @@ from plane.app.permissions import ProjectBasePermission
# Module imports
from plane.db.models import (
Cycle,
Inbox,
Intake,
IssueUserProperty,
Module,
Project,
@@ -39,9 +39,7 @@ class ProjectAPIEndpoint(BaseAPIView):
model = Project
webhook_event = "project"
permission_classes = [
ProjectBasePermission,
]
permission_classes = [ProjectBasePermission]
def get_queryset(self):
return (
@@ -54,10 +52,7 @@ class ProjectAPIEndpoint(BaseAPIView):
| Q(network=2)
)
.select_related(
"workspace",
"workspace__owner",
"default_assignee",
"project_lead",
"workspace", "workspace__owner", "default_assignee", "project_lead"
)
.annotate(
is_member=Exists(
@@ -71,9 +66,7 @@ class ProjectAPIEndpoint(BaseAPIView):
)
.annotate(
total_members=ProjectMember.objects.filter(
project_id=OuterRef("id"),
member__is_bot=False,
is_active=True,
project_id=OuterRef("id"), member__is_bot=False, is_active=True
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
@@ -125,8 +118,7 @@ class ProjectAPIEndpoint(BaseAPIView):
Prefetch(
"project_projectmember",
queryset=ProjectMember.objects.filter(
workspace__slug=slug,
is_active=True,
workspace__slug=slug, is_active=True
).select_related("member"),
)
)
@@ -136,18 +128,11 @@ class ProjectAPIEndpoint(BaseAPIView):
request=request,
queryset=(projects),
on_results=lambda projects: ProjectSerializer(
projects,
many=True,
fields=self.fields,
expand=self.expand,
projects, many=True, fields=self.fields, expand=self.expand
).data,
)
project = self.get_queryset().get(workspace__slug=slug, pk=pk)
serializer = ProjectSerializer(
project,
fields=self.fields,
expand=self.expand,
)
serializer = ProjectSerializer(project, fields=self.fields, expand=self.expand)
return Response(serializer.data, status=status.HTTP_200_OK)
def post(self, request, slug):
@@ -161,14 +146,11 @@ class ProjectAPIEndpoint(BaseAPIView):
# Add the user as Administrator to the project
_ = ProjectMember.objects.create(
project_id=serializer.data["id"],
member=request.user,
role=20,
project_id=serializer.data["id"], member=request.user, role=20
)
# Also create the issue property for the user
_ = IssueUserProperty.objects.create(
project_id=serializer.data["id"],
user=request.user,
project_id=serializer.data["id"], user=request.user
)
if serializer.data["project_lead"] is not None and str(
@@ -236,11 +218,7 @@ class ProjectAPIEndpoint(BaseAPIView):
]
)
project = (
self.get_queryset()
.filter(pk=serializer.data["id"])
.first()
)
project = self.get_queryset().filter(pk=serializer.data["id"]).first()
# Model activity
model_activity.delay(
@@ -254,13 +232,8 @@ class ProjectAPIEndpoint(BaseAPIView):
)
serializer = ProjectSerializer(project)
return Response(
serializer.data, status=status.HTTP_201_CREATED
)
return Response(
serializer.errors,
status=status.HTTP_400_BAD_REQUEST,
)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
except IntegrityError as e:
if "already exists" in str(e):
return Response(
@@ -269,8 +242,7 @@ class ProjectAPIEndpoint(BaseAPIView):
)
except Workspace.DoesNotExist:
return Response(
{"error": "Workspace does not exist"},
status=status.HTTP_404_NOT_FOUND,
{"error": "Workspace does not exist"}, status=status.HTTP_404_NOT_FOUND
)
except ValidationError:
return Response(
@@ -285,6 +257,9 @@ class ProjectAPIEndpoint(BaseAPIView):
current_instance = json.dumps(
ProjectSerializer(project).data, cls=DjangoJSONEncoder
)
intake_view = request.data.get("inbox_view", project.intake_view)
if project.archived_at:
return Response(
{"error": "Archived project cannot be updated"},
@@ -293,21 +268,20 @@ class ProjectAPIEndpoint(BaseAPIView):
serializer = ProjectSerializer(
project,
data={**request.data},
data={**request.data, "intake_view": intake_view},
context={"workspace_id": workspace.id},
partial=True,
)
if serializer.is_valid():
serializer.save()
if serializer.data["inbox_view"]:
inbox = Inbox.objects.filter(
project=project,
is_default=True,
if serializer.data["intake_view"]:
intake = Intake.objects.filter(
project=project, is_default=True
).first()
if not inbox:
Inbox.objects.create(
name=f"{project.name} Inbox",
if not intake:
Intake.objects.create(
name=f"{project.name} Intake",
project=project,
is_default=True,
)
@@ -316,17 +290,13 @@ class ProjectAPIEndpoint(BaseAPIView):
State.objects.get_or_create(
name="Triage",
group="triage",
description="Default state for managing all Inbox Issues",
description="Default state for managing all Intake Issues",
project_id=pk,
color="#ff7700",
is_triage=True,
)
project = (
self.get_queryset()
.filter(pk=serializer.data["id"])
.first()
)
project = self.get_queryset().filter(pk=serializer.data["id"]).first()
model_activity.delay(
model_name="project",
@@ -340,9 +310,7 @@ class ProjectAPIEndpoint(BaseAPIView):
serializer = ProjectSerializer(project)
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(
serializer.errors, status=status.HTTP_400_BAD_REQUEST
)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
except IntegrityError as e:
if "already exists" in str(e):
return Response(
@@ -351,8 +319,7 @@ class ProjectAPIEndpoint(BaseAPIView):
)
except (Project.DoesNotExist, Workspace.DoesNotExist):
return Response(
{"error": "Project does not exist"},
status=status.HTTP_404_NOT_FOUND,
{"error": "Project does not exist"}, status=status.HTTP_404_NOT_FOUND
)
except ValidationError:
return Response(
@@ -364,28 +331,20 @@ class ProjectAPIEndpoint(BaseAPIView):
project = Project.objects.get(pk=pk, workspace__slug=slug)
# Delete the user favorite cycle
UserFavorite.objects.filter(
entity_type="project",
entity_identifier=pk,
project_id=pk,
entity_type="project", entity_identifier=pk, project_id=pk
).delete()
project.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class ProjectArchiveUnarchiveAPIEndpoint(BaseAPIView):
permission_classes = [
ProjectBasePermission,
]
permission_classes = [ProjectBasePermission]
def post(self, request, slug, project_id):
project = Project.objects.get(pk=project_id, workspace__slug=slug)
project.archived_at = timezone.now()
project.save()
UserFavorite.objects.filter(
workspace__slug=slug,
project=project_id,
).delete()
UserFavorite.objects.filter(workspace__slug=slug, project=project_id).delete()
return Response(status=status.HTTP_204_NO_CONTENT)
def delete(self, request, slug, project_id):
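
The patch hunks fold the legacy inbox_view flag into intake_view and lazily create a default Intake (plus a Triage state) the first time the flag is enabled. A hedged sketch of just the intake bootstrapping, built from the model and field names visible above; the helper itself is not part of the diff:

# Illustrative helper, not the endpoint: ensure the project's default Intake exists.
from plane.db.models import Intake

def ensure_default_intake(project):
    intake = Intake.objects.filter(project=project, is_default=True).first()
    if intake is None:
        intake = Intake.objects.create(
            name=f"{project.name} Intake",
            project=project,
            is_default=True,
        )
    return intake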


@@ -16,9 +16,7 @@ from .base import BaseAPIView
class StateAPIEndpoint(BaseAPIView):
serializer_class = StateSerializer
model = State
permission_classes = [
ProjectEntityPermission,
]
permission_classes = [ProjectEntityPermission]
def get_queryset(self):
return (
@@ -67,9 +65,7 @@ class StateAPIEndpoint(BaseAPIView):
serializer.save(project_id=project_id)
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(
serializer.errors, status=status.HTTP_400_BAD_REQUEST
)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
except IntegrityError:
state = State.objects.filter(
workspace__slug=slug,
@@ -96,19 +92,13 @@ class StateAPIEndpoint(BaseAPIView):
request=request,
queryset=(self.get_queryset()),
on_results=lambda states: StateSerializer(
states,
many=True,
fields=self.fields,
expand=self.expand,
states, many=True, fields=self.fields, expand=self.expand
).data,
)
def delete(self, request, slug, project_id, state_id):
state = State.objects.get(
is_triage=False,
pk=state_id,
project_id=project_id,
workspace__slug=slug,
is_triage=False, pk=state_id, project_id=project_id, workspace__slug=slug
)
if state.default:
@@ -122,9 +112,7 @@ class StateAPIEndpoint(BaseAPIView):
if issue_exist:
return Response(
{
"error": "The state is not empty, only empty states can be deleted"
},
{"error": "The state is not empty, only empty states can be deleted"},
status=status.HTTP_400_BAD_REQUEST,
)


@@ -25,10 +25,7 @@ class APIKeyAuthentication(authentication.BaseAuthentication):
def validate_api_token(self, token):
try:
api_token = APIToken.objects.get(
Q(
Q(expired_at__gt=timezone.now())
| Q(expired_at__isnull=True)
),
Q(Q(expired_at__gt=timezone.now()) | Q(expired_at__isnull=True)),
token=token,
is_active=True,
)


@@ -12,4 +12,4 @@ from .project import (
ProjectMemberPermission,
ProjectLitePermission,
)
from .base import allow_permission, ROLE
from .base import allow_permission, ROLE


@@ -5,6 +5,7 @@ from rest_framework import status
from enum import Enum
class ROLE(Enum):
ADMIN = 20
MEMBER = 15
@@ -15,7 +16,6 @@ def allow_permission(allowed_roles, level="PROJECT", creator=False, model=None):
def decorator(view_func):
@wraps(view_func)
def _wrapped_view(instance, request, *args, **kwargs):
# Check for creator if required
if creator and model:
obj = model.objects.filter(
@@ -26,8 +26,7 @@ def allow_permission(allowed_roles, level="PROJECT", creator=False, model=None):
# Convert allowed_roles to their values if they are enum members
allowed_role_values = [
role.value if isinstance(role, ROLE) else role
for role in allowed_roles
role.value if isinstance(role, ROLE) else role for role in allowed_roles
]
# Check role permissions
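
For reference, a usage sketch of the decorator whose signature appears in the hunk header (allow_permission(allowed_roles, level="PROJECT", creator=False, model=None)); the view class, the import location of BaseAPIView, and the handler body are placeholders rather than code from this changeset:

# Placeholder view showing how the role decorator wraps a handler method;
# only allow_permission and ROLE come from the diff above.
from rest_framework import status
from rest_framework.response import Response

from plane.app.permissions import ROLE, allow_permission
from plane.app.views.base import BaseAPIView  # assumed location of BaseAPIView

class ExampleProjectView(BaseAPIView):
    @allow_permission([ROLE.ADMIN, ROLE.MEMBER])
    def post(self, request, slug, project_id):
        # Only admins and members reach this point; other roles are rejected
        # by the decorator before the handler runs.
        return Response(status=status.HTTP_204_NO_CONTENT)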


@@ -18,9 +18,7 @@ class ProjectBasePermission(BasePermission):
## Safe Methods -> Handle the filtering logic in queryset
if request.method in SAFE_METHODS:
return WorkspaceMember.objects.filter(
workspace__slug=view.workspace_slug,
member=request.user,
is_active=True,
workspace__slug=view.workspace_slug, member=request.user, is_active=True
).exists()
## Only workspace owners or admins can create the projects
@@ -50,9 +48,7 @@ class ProjectMemberPermission(BasePermission):
## Safe Methods -> Handle the filtering logic in queryset
if request.method in SAFE_METHODS:
return ProjectMember.objects.filter(
workspace__slug=view.workspace_slug,
member=request.user,
is_active=True,
workspace__slug=view.workspace_slug, member=request.user, is_active=True
).exists()
## Only workspace owners or admins can create the projects
if request.method == "POST":


@@ -50,9 +50,7 @@ class WorkspaceOwnerPermission(BasePermission):
return False
return WorkspaceMember.objects.filter(
workspace__slug=view.workspace_slug,
member=request.user,
role=Admin,
workspace__slug=view.workspace_slug, member=request.user, role=Admin
).exists()
@@ -77,9 +75,7 @@ class WorkspaceEntityPermission(BasePermission):
## Safe Methods -> Handle the filtering logic in queryset
if request.method in SAFE_METHODS:
return WorkspaceMember.objects.filter(
workspace__slug=view.workspace_slug,
member=request.user,
is_active=True,
workspace__slug=view.workspace_slug, member=request.user, is_active=True
).exists()
return WorkspaceMember.objects.filter(
@@ -96,9 +92,7 @@ class WorkspaceViewerPermission(BasePermission):
return False
return WorkspaceMember.objects.filter(
member=request.user,
workspace__slug=view.workspace_slug,
is_active=True,
member=request.user, workspace__slug=view.workspace_slug, is_active=True
).exists()
@@ -108,7 +102,5 @@ class WorkspaceUserPermission(BasePermission):
return False
return WorkspaceMember.objects.filter(
member=request.user,
workspace__slug=view.workspace_slug,
is_active=True,
member=request.user, workspace__slug=view.workspace_slug, is_active=True
).exists()


@@ -13,7 +13,6 @@ from .user import (
from .workspace import (
WorkSpaceSerializer,
WorkSpaceMemberSerializer,
TeamSerializer,
WorkSpaceMemberInviteSerializer,
WorkspaceLiteSerializer,
WorkspaceThemeSerializer,
@@ -36,9 +35,7 @@ from .project import (
ProjectMemberRoleSerializer,
)
from .state import StateSerializer, StateLiteSerializer
from .view import (
IssueViewSerializer,
)
from .view import IssueViewSerializer
from .cycle import (
CycleSerializer,
CycleIssueSerializer,
@@ -57,7 +54,7 @@ from .issue import (
IssueFlatSerializer,
IssueStateSerializer,
IssueLinkSerializer,
IssueInboxSerializer,
IssueIntakeSerializer,
IssueLiteSerializer,
IssueAttachmentSerializer,
IssueSubscriberSerializer,
@@ -102,20 +99,17 @@ from .estimate import (
WorkspaceEstimateSerializer,
)
from .inbox import (
InboxSerializer,
InboxIssueSerializer,
IssueStateInboxSerializer,
InboxIssueLiteSerializer,
InboxIssueDetailSerializer,
from .intake import (
IntakeSerializer,
IntakeIssueSerializer,
IssueStateIntakeSerializer,
IntakeIssueLiteSerializer,
IntakeIssueDetailSerializer,
)
from .analytic import AnalyticViewSerializer
from .notification import (
NotificationSerializer,
UserNotificationPreferenceSerializer,
)
from .notification import NotificationSerializer, UserNotificationPreferenceSerializer
from .exporter import ExporterHistorySerializer
@@ -124,3 +118,9 @@ from .webhook import WebhookSerializer, WebhookLogSerializer
from .dashboard import DashboardSerializer, WidgetSerializer
from .favorite import UserFavoriteSerializer
from .draft import (
DraftIssueCreateSerializer,
DraftIssueSerializer,
DraftIssueDetailSerializer,
)


@@ -7,10 +7,7 @@ class AnalyticViewSerializer(BaseSerializer):
class Meta:
model = AnalyticView
fields = "__all__"
read_only_fields = [
"workspace",
"query",
]
read_only_fields = ["workspace", "query"]
def create(self, validated_data):
query_params = validated_data.get("query_dict", {})


@@ -6,9 +6,4 @@ class FileAssetSerializer(BaseSerializer):
class Meta:
model = FileAsset
fields = "__all__"
read_only_fields = [
"created_by",
"updated_by",
"created_at",
"updated_at",
]
read_only_fields = ["created_by", "updated_by", "created_at", "updated_at"]


@@ -60,10 +60,10 @@ class DynamicBaseSerializer(BaseSerializer):
CycleIssueSerializer,
IssueLiteSerializer,
IssueRelationSerializer,
InboxIssueLiteSerializer,
IntakeIssueLiteSerializer,
IssueReactionLiteSerializer,
IssueAttachmentLiteSerializer,
IssueLinkLiteSerializer,
RelatedIssueSerializer,
)
# Expansion mapper
@@ -84,13 +84,14 @@ class DynamicBaseSerializer(BaseSerializer):
"issue_cycle": CycleIssueSerializer,
"parent": IssueLiteSerializer,
"issue_relation": IssueRelationSerializer,
"issue_inbox": InboxIssueLiteSerializer,
"issue_intake": IntakeIssueLiteSerializer,
"issue_related": RelatedIssueSerializer,
"issue_reactions": IssueReactionLiteSerializer,
"issue_attachment": IssueAttachmentLiteSerializer,
"issue_link": IssueLinkLiteSerializer,
"sub_issues": IssueLiteSerializer,
}
if field not in self.fields and field in expansion:
self.fields[field] = expansion[field](
many=(
True
@@ -101,11 +102,12 @@ class DynamicBaseSerializer(BaseSerializer):
"labels",
"issue_cycle",
"issue_relation",
"issue_inbox",
"issue_intake",
"issue_reactions",
"issue_attachment",
"issue_link",
"sub_issues",
"issue_related",
]
else False
)
@@ -130,11 +132,12 @@ class DynamicBaseSerializer(BaseSerializer):
LabelSerializer,
CycleIssueSerializer,
IssueRelationSerializer,
InboxIssueLiteSerializer,
IntakeIssueLiteSerializer,
IssueLiteSerializer,
IssueReactionLiteSerializer,
IssueAttachmentLiteSerializer,
IssueLinkLiteSerializer,
RelatedIssueSerializer,
)
# Expansion mapper
@@ -155,7 +158,8 @@ class DynamicBaseSerializer(BaseSerializer):
"issue_cycle": CycleIssueSerializer,
"parent": IssueLiteSerializer,
"issue_relation": IssueRelationSerializer,
"issue_inbox": InboxIssueLiteSerializer,
"issue_intake": IntakeIssueLiteSerializer,
"issue_related": RelatedIssueSerializer,
"issue_reactions": IssueReactionLiteSerializer,
"issue_attachment": IssueAttachmentLiteSerializer,
"issue_link": IssueLinkLiteSerializer,
@@ -174,8 +178,26 @@ class DynamicBaseSerializer(BaseSerializer):
response[expand] = exp_serializer.data
else:
# You might need to handle this case differently
response[expand] = getattr(
instance, f"{expand}_id", None
)
response[expand] = getattr(instance, f"{expand}_id", None)
# Check if issue_attachments is in fields or expand
if "issue_attachments" in self.fields or "issue_attachments" in self.expand:
# Import the model here to avoid circular imports
from plane.db.models import FileAsset
issue_id = getattr(instance, "id", None)
if issue_id:
# Fetch related issue_attachments
issue_attachments = FileAsset.objects.filter(
issue_id=issue_id,
entity_type=FileAsset.EntityTypeContext.ISSUE_ATTACHMENT,
)
# Serialize issue_attachments and add them to the response
response["issue_attachments"] = IssueAttachmentLiteSerializer(
issue_attachments, many=True
).data
else:
response["issue_attachments"] = []
return response
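
The new branch above lets any DynamicBaseSerializer subclass surface attachments by asking for issue_attachments, with the rows now coming from FileAsset. A hedged usage sketch; the expand syntax and the import path are assumptions based on how fields/expand are passed elsewhere in this diff:

# Illustrative: request the attachment expansion and read the embedded rows.
from plane.app.serializers import IssueSerializer  # assumed re-export path

def attachment_ids_for(issue):
    data = IssueSerializer(issue, expand=["issue_attachments"]).data
    return [attachment["id"] for attachment in data.get("issue_attachments", [])]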


@@ -4,11 +4,8 @@ from rest_framework import serializers
# Module imports
from .base import BaseSerializer
from .issue import IssueStateSerializer
from plane.db.models import (
Cycle,
CycleIssue,
CycleUserProperties,
)
from plane.db.models import Cycle, CycleIssue, CycleUserProperties
from plane.utils.timezone_converter import convert_to_utc
class CycleWriteSerializer(BaseSerializer):
@@ -18,20 +15,24 @@ class CycleWriteSerializer(BaseSerializer):
and data.get("end_date", None) is not None
and data.get("start_date", None) > data.get("end_date", None)
):
raise serializers.ValidationError(
"Start date cannot exceed end date"
raise serializers.ValidationError("Start date cannot exceed end date")
if (
data.get("start_date", None) is not None
and data.get("end_date", None) is not None
):
project_id = self.initial_data.get("project_id") or self.instance.project_id
data["start_date"] = convert_to_utc(
str(data.get("start_date").date()), project_id, is_start_date=True
)
data["end_date"] = convert_to_utc(
str(data.get("end_date", None).date()), project_id
)
return data
class Meta:
model = Cycle
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"owned_by",
"archived_at",
]
read_only_fields = ["workspace", "project", "owned_by", "archived_at"]
class CycleSerializer(BaseSerializer):
@@ -87,18 +88,11 @@ class CycleIssueSerializer(BaseSerializer):
class Meta:
model = CycleIssue
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"cycle",
]
read_only_fields = ["workspace", "project", "cycle"]
class CycleUserPropertiesSerializer(BaseSerializer):
class Meta:
model = CycleUserProperties
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"cycle" "user",
]
read_only_fields = ["workspace", "project", "cycle" "user"]


@@ -0,0 +1,271 @@
# Django imports
from django.utils import timezone
# Third Party imports
from rest_framework import serializers
# Module imports
from .base import BaseSerializer
from plane.db.models import (
User,
Issue,
Label,
State,
DraftIssue,
DraftIssueAssignee,
DraftIssueLabel,
DraftIssueCycle,
DraftIssueModule,
)
class DraftIssueCreateSerializer(BaseSerializer):
# ids
state_id = serializers.PrimaryKeyRelatedField(
source="state", queryset=State.objects.all(), required=False, allow_null=True
)
parent_id = serializers.PrimaryKeyRelatedField(
source="parent", queryset=Issue.objects.all(), required=False, allow_null=True
)
label_ids = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(queryset=Label.objects.all()),
write_only=True,
required=False,
)
assignee_ids = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(queryset=User.objects.all()),
write_only=True,
required=False,
)
class Meta:
model = DraftIssue
fields = "__all__"
read_only_fields = [
"workspace",
"created_by",
"updated_by",
"created_at",
"updated_at",
]
def to_representation(self, instance):
data = super().to_representation(instance)
assignee_ids = self.initial_data.get("assignee_ids")
data["assignee_ids"] = assignee_ids if assignee_ids else []
label_ids = self.initial_data.get("label_ids")
data["label_ids"] = label_ids if label_ids else []
return data
def validate(self, data):
if (
data.get("start_date", None) is not None
and data.get("target_date", None) is not None
and data.get("start_date", None) > data.get("target_date", None)
):
raise serializers.ValidationError("Start date cannot exceed target date")
return data
def create(self, validated_data):
assignees = validated_data.pop("assignee_ids", None)
labels = validated_data.pop("label_ids", None)
modules = validated_data.pop("module_ids", None)
cycle_id = self.initial_data.get("cycle_id", None)
modules = self.initial_data.get("module_ids", None)
workspace_id = self.context["workspace_id"]
project_id = self.context["project_id"]
# Create Issue
issue = DraftIssue.objects.create(
**validated_data, workspace_id=workspace_id, project_id=project_id
)
# Issue Audit Users
created_by_id = issue.created_by_id
updated_by_id = issue.updated_by_id
if assignees is not None and len(assignees):
DraftIssueAssignee.objects.bulk_create(
[
DraftIssueAssignee(
assignee=user,
draft_issue=issue,
workspace_id=workspace_id,
project_id=project_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
for user in assignees
],
batch_size=10,
)
if labels is not None and len(labels):
DraftIssueLabel.objects.bulk_create(
[
DraftIssueLabel(
label=label,
draft_issue=issue,
project_id=project_id,
workspace_id=workspace_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
for label in labels
],
batch_size=10,
)
if cycle_id is not None:
DraftIssueCycle.objects.create(
cycle_id=cycle_id,
draft_issue=issue,
project_id=project_id,
workspace_id=workspace_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
if modules is not None and len(modules):
DraftIssueModule.objects.bulk_create(
[
DraftIssueModule(
module_id=module_id,
draft_issue=issue,
project_id=project_id,
workspace_id=workspace_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
for module_id in modules
],
batch_size=10,
)
return issue
def update(self, instance, validated_data):
assignees = validated_data.pop("assignee_ids", None)
labels = validated_data.pop("label_ids", None)
cycle_id = self.context.get("cycle_id", None)
modules = self.initial_data.get("module_ids", None)
# Related models
workspace_id = instance.workspace_id
project_id = instance.project_id
created_by_id = instance.created_by_id
updated_by_id = instance.updated_by_id
if assignees is not None:
DraftIssueAssignee.objects.filter(draft_issue=instance).delete()
DraftIssueAssignee.objects.bulk_create(
[
DraftIssueAssignee(
assignee=user,
draft_issue=instance,
workspace_id=workspace_id,
project_id=project_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
for user in assignees
],
batch_size=10,
)
if labels is not None:
DraftIssueLabel.objects.filter(draft_issue=instance).delete()
DraftIssueLabel.objects.bulk_create(
[
DraftIssueLabel(
label=label,
draft_issue=instance,
workspace_id=workspace_id,
project_id=project_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
for label in labels
],
batch_size=10,
)
if cycle_id != "not_provided":
DraftIssueCycle.objects.filter(draft_issue=instance).delete()
if cycle_id:
DraftIssueCycle.objects.create(
cycle_id=cycle_id,
draft_issue=instance,
workspace_id=workspace_id,
project_id=project_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
if modules is not None:
DraftIssueModule.objects.filter(draft_issue=instance).delete()
DraftIssueModule.objects.bulk_create(
[
DraftIssueModule(
module_id=module_id,
draft_issue=instance,
workspace_id=workspace_id,
project_id=project_id,
created_by_id=created_by_id,
updated_by_id=updated_by_id,
)
for module_id in modules
],
batch_size=10,
)
# Time updation occurs even when other related models are updated
instance.updated_at = timezone.now()
return super().update(instance, validated_data)
class DraftIssueSerializer(BaseSerializer):
# ids
cycle_id = serializers.PrimaryKeyRelatedField(read_only=True)
module_ids = serializers.ListField(child=serializers.UUIDField(), required=False)
# Many to many
label_ids = serializers.ListField(child=serializers.UUIDField(), required=False)
assignee_ids = serializers.ListField(child=serializers.UUIDField(), required=False)
class Meta:
model = DraftIssue
fields = [
"id",
"name",
"state_id",
"sort_order",
"completed_at",
"estimate_point",
"priority",
"start_date",
"target_date",
"project_id",
"parent_id",
"cycle_id",
"module_ids",
"label_ids",
"assignee_ids",
"created_at",
"updated_at",
"created_by",
"updated_by",
"type_id",
"description_html",
]
read_only_fields = fields
class DraftIssueDetailSerializer(DraftIssueSerializer):
description_html = serializers.CharField()
class Meta(DraftIssueSerializer.Meta):
fields = DraftIssueSerializer.Meta.fields + ["description_html"]
read_only_fields = fields
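
A hedged usage sketch for the new DraftIssueCreateSerializer: the payload values are placeholders, the context keys mirror the ones read in create() above, and the import assumes the re-export added to the serializers __init__ earlier in this diff:

# Illustrative only: creating a draft issue plus its related rows in one call.
from plane.app.serializers import DraftIssueCreateSerializer  # assumed re-export path

def create_draft(workspace_id, project_id):
    serializer = DraftIssueCreateSerializer(
        data={"name": "Draft: onboarding flow", "assignee_ids": [], "label_ids": []},
        context={"workspace_id": workspace_id, "project_id": project_id},
    )
    serializer.is_valid(raise_exception=True)
    return serializer.save()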


@@ -7,14 +7,10 @@ from rest_framework import serializers
class EstimateSerializer(BaseSerializer):
class Meta:
model = Estimate
fields = "__all__"
read_only_fields = [
"workspace",
"project",
]
read_only_fields = ["workspace", "project"]
class EstimatePointSerializer(BaseSerializer):
@@ -23,19 +19,13 @@ class EstimatePointSerializer(BaseSerializer):
raise serializers.ValidationError("Estimate points are required")
value = data.get("value")
if value and len(value) > 20:
raise serializers.ValidationError(
"Value can't be more than 20 characters"
)
raise serializers.ValidationError("Value can't be more than 20 characters")
return data
class Meta:
model = EstimatePoint
fields = "__all__"
read_only_fields = [
"estimate",
"workspace",
"project",
]
read_only_fields = ["estimate", "workspace", "project"]
class EstimateReadSerializer(BaseSerializer):
@@ -44,11 +34,7 @@ class EstimateReadSerializer(BaseSerializer):
class Meta:
model = Estimate
fields = "__all__"
read_only_fields = [
"points",
"name",
"description",
]
read_only_fields = ["points", "name", "description"]
class WorkspaceEstimateSerializer(BaseSerializer):
@@ -57,8 +43,4 @@ class WorkspaceEstimateSerializer(BaseSerializer):
class Meta:
model = Estimate
fields = "__all__"
read_only_fields = [
"points",
"name",
"description",
]
read_only_fields = ["points", "name", "description"]


@@ -5,9 +5,7 @@ from .user import UserLiteSerializer
class ExporterHistorySerializer(BaseSerializer):
initiated_by_detail = UserLiteSerializer(
source="initiated_by", read_only=True
)
initiated_by_detail = UserLiteSerializer(source="initiated_by", read_only=True)
class Meta:
model = ExporterHistory


@@ -1,18 +1,9 @@
from rest_framework import serializers
from plane.db.models import (
UserFavorite,
Cycle,
Module,
Issue,
IssueView,
Page,
Project,
)
from plane.db.models import UserFavorite, Cycle, Module, Issue, IssueView, Page, Project
class ProjectFavoriteLiteSerializer(serializers.ModelSerializer):
class Meta:
model = Project
fields = ["id", "name", "logo_props"]
@@ -33,21 +24,18 @@ class PageFavoriteLiteSerializer(serializers.ModelSerializer):
class CycleFavoriteLiteSerializer(serializers.ModelSerializer):
class Meta:
model = Cycle
fields = ["id", "name", "logo_props", "project_id"]
class ModuleFavoriteLiteSerializer(serializers.ModelSerializer):
class Meta:
model = Module
fields = ["id", "name", "logo_props", "project_id"]
class ViewFavoriteSerializer(serializers.ModelSerializer):
class Meta:
model = IssueView
fields = ["id", "name", "logo_props", "project_id"]
@@ -89,9 +77,7 @@ class UserFavoriteSerializer(serializers.ModelSerializer):
entity_type = obj.entity_type
entity_identifier = obj.entity_identifier
entity_model, entity_serializer = get_entity_model_and_serializer(
entity_type
)
entity_model, entity_serializer = get_entity_model_and_serializer(entity_type)
if entity_model and entity_serializer:
try:
entity = entity_model.objects.get(pk=entity_identifier)


@@ -7,13 +7,9 @@ from plane.db.models import Importer
class ImporterSerializer(BaseSerializer):
initiated_by_detail = UserLiteSerializer(
source="initiated_by", read_only=True
)
initiated_by_detail = UserLiteSerializer(source="initiated_by", read_only=True)
project_detail = ProjectLiteSerializer(source="project", read_only=True)
workspace_detail = WorkspaceLiteSerializer(
source="workspace", read_only=True
)
workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
class Meta:
model = Importer


@@ -3,35 +3,28 @@ from rest_framework import serializers
# Module imports
from .base import BaseSerializer
from .issue import (
IssueInboxSerializer,
LabelLiteSerializer,
IssueDetailSerializer,
)
from .issue import IssueIntakeSerializer, LabelLiteSerializer, IssueDetailSerializer
from .project import ProjectLiteSerializer
from .state import StateLiteSerializer
from .user import UserLiteSerializer
from plane.db.models import Inbox, InboxIssue, Issue
from plane.db.models import Intake, IntakeIssue, Issue
class InboxSerializer(BaseSerializer):
class IntakeSerializer(BaseSerializer):
project_detail = ProjectLiteSerializer(source="project", read_only=True)
pending_issue_count = serializers.IntegerField(read_only=True)
class Meta:
model = Inbox
model = Intake
fields = "__all__"
read_only_fields = [
"project",
"workspace",
]
read_only_fields = ["project", "workspace"]
class InboxIssueSerializer(BaseSerializer):
issue = IssueInboxSerializer(read_only=True)
class IntakeIssueSerializer(BaseSerializer):
issue = IssueIntakeSerializer(read_only=True)
class Meta:
model = InboxIssue
model = IntakeIssue
fields = [
"id",
"status",
@@ -41,10 +34,7 @@ class InboxIssueSerializer(BaseSerializer):
"issue",
"created_by",
]
read_only_fields = [
"project",
"workspace",
]
read_only_fields = ["project", "workspace"]
def to_representation(self, instance):
# Pass the annotated fields to the Issue instance if they exist
@@ -53,14 +43,14 @@ class InboxIssueSerializer(BaseSerializer):
return super().to_representation(instance)
class InboxIssueDetailSerializer(BaseSerializer):
class IntakeIssueDetailSerializer(BaseSerializer):
issue = IssueDetailSerializer(read_only=True)
duplicate_issue_detail = IssueInboxSerializer(
duplicate_issue_detail = IssueIntakeSerializer(
read_only=True, source="duplicate_to"
)
class Meta:
model = InboxIssue
model = IntakeIssue
fields = [
"id",
"status",
@@ -70,10 +60,7 @@ class InboxIssueDetailSerializer(BaseSerializer):
"source",
"issue",
]
read_only_fields = [
"project",
"workspace",
]
read_only_fields = ["project", "workspace"]
def to_representation(self, instance):
# Pass the annotated fields to the Issue instance if they exist
@@ -85,24 +72,20 @@ class InboxIssueDetailSerializer(BaseSerializer):
return super().to_representation(instance)
class InboxIssueLiteSerializer(BaseSerializer):
class IntakeIssueLiteSerializer(BaseSerializer):
class Meta:
model = InboxIssue
model = IntakeIssue
fields = ["id", "status", "duplicate_to", "snoozed_till", "source"]
read_only_fields = fields
class IssueStateInboxSerializer(BaseSerializer):
class IssueStateIntakeSerializer(BaseSerializer):
state_detail = StateLiteSerializer(read_only=True, source="state")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
label_details = LabelLiteSerializer(
read_only=True, source="labels", many=True
)
assignee_details = UserLiteSerializer(
read_only=True, source="assignees", many=True
)
label_details = LabelLiteSerializer(read_only=True, source="labels", many=True)
assignee_details = UserLiteSerializer(read_only=True, source="assignees", many=True)
sub_issues_count = serializers.IntegerField(read_only=True)
issue_inbox = InboxIssueLiteSerializer(read_only=True, many=True)
issue_intake = IntakeIssueLiteSerializer(read_only=True, many=True)
class Meta:
model = Issue


@@ -27,7 +27,7 @@ from plane.db.models import (
Module,
ModuleIssue,
IssueLink,
IssueAttachment,
FileAsset,
IssueReaction,
CommentReaction,
IssueVote,
@@ -60,12 +60,7 @@ class IssueProjectLiteSerializer(BaseSerializer):
class Meta:
model = Issue
fields = [
"id",
"project_detail",
"name",
"sequence_id",
]
fields = ["id", "project_detail", "name", "sequence_id"]
read_only_fields = fields
@@ -74,16 +69,10 @@ class IssueProjectLiteSerializer(BaseSerializer):
class IssueCreateSerializer(BaseSerializer):
# ids
state_id = serializers.PrimaryKeyRelatedField(
source="state",
queryset=State.objects.all(),
required=False,
allow_null=True,
source="state", queryset=State.objects.all(), required=False, allow_null=True
)
parent_id = serializers.PrimaryKeyRelatedField(
source="parent",
queryset=Issue.objects.all(),
required=False,
allow_null=True,
source="parent", queryset=Issue.objects.all(), required=False, allow_null=True
)
label_ids = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(queryset=Label.objects.all()),
@@ -95,6 +84,8 @@ class IssueCreateSerializer(BaseSerializer):
write_only=True,
required=False,
)
project_id = serializers.UUIDField(source="project.id", read_only=True)
workspace_id = serializers.UUIDField(source="workspace.id", read_only=True)
class Meta:
model = Issue
@@ -122,9 +113,7 @@ class IssueCreateSerializer(BaseSerializer):
and data.get("target_date", None) is not None
and data.get("start_date", None) > data.get("target_date", None)
):
raise serializers.ValidationError(
"Start date cannot exceed target date"
)
raise serializers.ValidationError("Start date cannot exceed target date")
return data
def create(self, validated_data):
@@ -136,10 +125,7 @@ class IssueCreateSerializer(BaseSerializer):
default_assignee_id = self.context["default_assignee_id"]
# Create Issue
issue = Issue.objects.create(
**validated_data,
project_id=project_id,
)
issue = Issue.objects.create(**validated_data, project_id=project_id)
# Issue Audit Users
created_by_id = issue.created_by_id
@@ -243,9 +229,7 @@ class IssueActivitySerializer(BaseSerializer):
actor_detail = UserLiteSerializer(read_only=True, source="actor")
issue_detail = IssueFlatSerializer(read_only=True, source="issue")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
workspace_detail = WorkspaceLiteSerializer(
read_only=True, source="workspace"
)
workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
class Meta:
model = IssueActivity
@@ -256,11 +240,7 @@ class IssueUserPropertySerializer(BaseSerializer):
class Meta:
model = IssueUserProperty
fields = "__all__"
read_only_fields = [
"user",
"workspace",
"project",
]
read_only_fields = ["user", "workspace", "project"]
class LabelSerializer(BaseSerializer):
@@ -275,30 +255,20 @@ class LabelSerializer(BaseSerializer):
"workspace_id",
"sort_order",
]
read_only_fields = [
"workspace",
"project",
]
read_only_fields = ["workspace", "project"]
class LabelLiteSerializer(BaseSerializer):
class Meta:
model = Label
fields = [
"id",
"name",
"color",
]
fields = ["id", "name", "color"]
class IssueLabelSerializer(BaseSerializer):
class Meta:
model = IssueLabel
fields = "__all__"
read_only_fields = [
"workspace",
"project",
]
read_only_fields = ["workspace", "project"]
class IssueRelationSerializer(BaseSerializer):
@@ -314,17 +284,8 @@ class IssueRelationSerializer(BaseSerializer):
class Meta:
model = IssueRelation
fields = [
"id",
"project_id",
"sequence_id",
"relation_type",
"name",
]
read_only_fields = [
"workspace",
"project",
]
fields = ["id", "project_id", "sequence_id", "relation_type", "name"]
read_only_fields = ["workspace", "project"]
class RelatedIssueSerializer(BaseSerializer):
@@ -332,25 +293,14 @@ class RelatedIssueSerializer(BaseSerializer):
project_id = serializers.PrimaryKeyRelatedField(
source="issue.project_id", read_only=True
)
sequence_id = serializers.IntegerField(
source="issue.sequence_id", read_only=True
)
sequence_id = serializers.IntegerField(source="issue.sequence_id", read_only=True)
name = serializers.CharField(source="issue.name", read_only=True)
relation_type = serializers.CharField(read_only=True)
class Meta:
model = IssueRelation
fields = [
"id",
"project_id",
"sequence_id",
"relation_type",
"name",
]
read_only_fields = [
"workspace",
"project",
]
fields = ["id", "project_id", "sequence_id", "relation_type", "name"]
read_only_fields = ["workspace", "project"]
class IssueAssigneeSerializer(BaseSerializer):
@@ -458,8 +408,7 @@ class IssueLinkSerializer(BaseSerializer):
# Validation if url already exists
def create(self, validated_data):
if IssueLink.objects.filter(
url=validated_data.get("url"),
issue_id=validated_data.get("issue_id"),
url=validated_data.get("url"), issue_id=validated_data.get("issue_id")
).exists():
raise serializers.ValidationError(
{"error": "URL already exists for this Issue"}
@@ -469,8 +418,7 @@ class IssueLinkSerializer(BaseSerializer):
def update(self, instance, validated_data):
if (
IssueLink.objects.filter(
url=validated_data.get("url"),
issue_id=instance.issue_id,
url=validated_data.get("url"), issue_id=instance.issue_id
)
.exclude(pk=instance.id)
.exists()
@@ -498,8 +446,10 @@ class IssueLinkLiteSerializer(BaseSerializer):
class IssueAttachmentSerializer(BaseSerializer):
asset_url = serializers.CharField(read_only=True)
class Meta:
model = IssueAttachment
model = FileAsset
fields = "__all__"
read_only_fields = [
"created_by",
@@ -514,14 +464,15 @@ class IssueAttachmentSerializer(BaseSerializer):
class IssueAttachmentLiteSerializer(DynamicBaseSerializer):
class Meta:
model = IssueAttachment
model = FileAsset
fields = [
"id",
"asset",
"attributes",
"issue_id",
# "issue_id",
"updated_at",
"updated_by",
"asset_url",
]
read_only_fields = fields
@@ -532,37 +483,20 @@ class IssueReactionSerializer(BaseSerializer):
class Meta:
model = IssueReaction
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"issue",
"actor",
"deleted_at",
]
read_only_fields = ["workspace", "project", "issue", "actor", "deleted_at"]
class IssueReactionLiteSerializer(DynamicBaseSerializer):
class Meta:
model = IssueReaction
fields = [
"id",
"actor",
"issue",
"reaction",
]
fields = ["id", "actor", "issue", "reaction"]
class CommentReactionSerializer(BaseSerializer):
class Meta:
model = CommentReaction
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"comment",
"actor",
"deleted_at",
]
read_only_fields = ["workspace", "project", "comment", "actor", "deleted_at"]
class IssueVoteSerializer(BaseSerializer):
@@ -570,14 +504,7 @@ class IssueVoteSerializer(BaseSerializer):
class Meta:
model = IssueVote
fields = [
"issue",
"vote",
"workspace",
"project",
"actor",
"actor_detail",
]
fields = ["issue", "vote", "workspace", "project", "actor", "actor_detail"]
read_only_fields = fields
@@ -585,9 +512,7 @@ class IssueCommentSerializer(BaseSerializer):
actor_detail = UserLiteSerializer(read_only=True, source="actor")
issue_detail = IssueFlatSerializer(read_only=True, source="issue")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
workspace_detail = WorkspaceLiteSerializer(
read_only=True, source="workspace"
)
workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
comment_reactions = CommentReactionSerializer(read_only=True, many=True)
is_member = serializers.BooleanField(read_only=True)
@@ -611,25 +536,15 @@ class IssueStateFlatSerializer(BaseSerializer):
class Meta:
model = Issue
fields = [
"id",
"sequence_id",
"name",
"state_detail",
"project_detail",
]
fields = ["id", "sequence_id", "name", "state_detail", "project_detail"]
# Issue Serializer with state details
class IssueStateSerializer(DynamicBaseSerializer):
label_details = LabelLiteSerializer(
read_only=True, source="labels", many=True
)
label_details = LabelLiteSerializer(read_only=True, source="labels", many=True)
state_detail = StateLiteSerializer(read_only=True, source="state")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
assignee_details = UserLiteSerializer(
read_only=True, source="assignees", many=True
)
assignee_details = UserLiteSerializer(read_only=True, source="assignees", many=True)
sub_issues_count = serializers.IntegerField(read_only=True)
attachment_count = serializers.IntegerField(read_only=True)
link_count = serializers.IntegerField(read_only=True)
@@ -639,11 +554,8 @@ class IssueStateSerializer(DynamicBaseSerializer):
fields = "__all__"
class IssueInboxSerializer(DynamicBaseSerializer):
label_ids = serializers.ListField(
child=serializers.UUIDField(),
required=False,
)
class IssueIntakeSerializer(DynamicBaseSerializer):
label_ids = serializers.ListField(child=serializers.UUIDField(), required=False)
class Meta:
model = Issue
@@ -663,20 +575,11 @@ class IssueInboxSerializer(DynamicBaseSerializer):
class IssueSerializer(DynamicBaseSerializer):
# ids
cycle_id = serializers.PrimaryKeyRelatedField(read_only=True)
module_ids = serializers.ListField(
child=serializers.UUIDField(),
required=False,
)
module_ids = serializers.ListField(child=serializers.UUIDField(), required=False)
# Many to many
label_ids = serializers.ListField(
child=serializers.UUIDField(),
required=False,
)
assignee_ids = serializers.ListField(
child=serializers.UUIDField(),
required=False,
)
label_ids = serializers.ListField(child=serializers.UUIDField(), required=False)
assignee_ids = serializers.ListField(child=serializers.UUIDField(), required=False)
# Count items
sub_issues_count = serializers.IntegerField(read_only=True)
@@ -718,11 +621,7 @@ class IssueSerializer(DynamicBaseSerializer):
class IssueLiteSerializer(DynamicBaseSerializer):
class Meta:
model = Issue
fields = [
"id",
"sequence_id",
"project_id",
]
fields = ["id", "sequence_id", "project_id"]
read_only_fields = fields
@@ -731,10 +630,7 @@ class IssueDetailSerializer(IssueSerializer):
is_subscribed = serializers.BooleanField(read_only=True)
class Meta(IssueSerializer.Meta):
fields = IssueSerializer.Meta.fields + [
"description_html",
"is_subscribed",
]
fields = IssueSerializer.Meta.fields + ["description_html", "is_subscribed"]
read_only_fields = fields
@@ -770,8 +666,4 @@ class IssueSubscriberSerializer(BaseSerializer):
class Meta:
model = IssueSubscriber
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"issue",
]
read_only_fields = ["workspace", "project", "issue"]


@@ -21,10 +21,7 @@ from plane.db.models import (
class ModuleWriteSerializer(BaseSerializer):
lead_id = serializers.PrimaryKeyRelatedField(
source="lead",
queryset=User.objects.all(),
required=False,
allow_null=True,
source="lead", queryset=User.objects.all(), required=False, allow_null=True
)
member_ids = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(queryset=User.objects.all()),
@@ -48,9 +45,7 @@ class ModuleWriteSerializer(BaseSerializer):
def to_representation(self, instance):
data = super().to_representation(instance)
data["member_ids"] = [
str(member.id) for member in instance.members.all()
]
data["member_ids"] = [str(member.id) for member in instance.members.all()]
return data
def validate(self, data):
@@ -59,9 +54,7 @@ class ModuleWriteSerializer(BaseSerializer):
and data.get("target_date", None) is not None
and data.get("start_date", None) > data.get("target_date", None)
):
raise serializers.ValidationError(
"Start date cannot exceed target date"
)
raise serializers.ValidationError("Start date cannot exceed target date")
return data
def create(self, validated_data):
@@ -71,9 +64,7 @@ class ModuleWriteSerializer(BaseSerializer):
module_name = validated_data.get("name")
if module_name:
# Lookup for the module name in the module table for that project
if Module.objects.filter(
name=module_name, project=project
).exists():
if Module.objects.filter(name=module_name, project=project).exists():
raise serializers.ValidationError(
{"error": "Module with this name already exists"}
)
@@ -104,9 +95,7 @@ class ModuleWriteSerializer(BaseSerializer):
if module_name:
# Lookup for the module name in the module table for that project
if (
Module.objects.filter(
name=module_name, project=instance.project
)
Module.objects.filter(name=module_name, project=instance.project)
.exclude(id=instance.id)
.exists()
):
@@ -203,8 +192,7 @@ class ModuleLinkSerializer(BaseSerializer):
def create(self, validated_data):
validated_data["url"] = self.validate_url(validated_data.get("url"))
if ModuleLink.objects.filter(
url=validated_data.get("url"),
module_id=validated_data.get("module_id"),
url=validated_data.get("url"), module_id=validated_data.get("module_id")
).exists():
raise serializers.ValidationError({"error": "URL already exists."})
return super().create(validated_data)
@@ -213,8 +201,7 @@ class ModuleLinkSerializer(BaseSerializer):
validated_data["url"] = self.validate_url(validated_data.get("url"))
if (
ModuleLink.objects.filter(
url=validated_data.get("url"),
module_id=instance.module_id,
url=validated_data.get("url"), module_id=instance.module_id
)
.exclude(pk=instance.id)
.exists()


@@ -8,10 +8,9 @@ from rest_framework import serializers
 class NotificationSerializer(BaseSerializer):
-    triggered_by_details = UserLiteSerializer(
-        read_only=True, source="triggered_by"
-    )
+    triggered_by_details = UserLiteSerializer(read_only=True, source="triggered_by")
     is_inbox_issue = serializers.BooleanField(read_only=True)
+    is_intake_issue = serializers.BooleanField(read_only=True)
     is_mentioned_notification = serializers.BooleanField(read_only=True)
     class Meta:

View File

@@ -22,14 +22,8 @@ class PageSerializer(BaseSerializer):
         required=False,
     )
     # Many to many
-    label_ids = serializers.ListField(
-        child=serializers.UUIDField(),
-        required=False,
-    )
-    project_ids = serializers.ListField(
-        child=serializers.UUIDField(),
-        required=False,
-    )
+    label_ids = serializers.ListField(child=serializers.UUIDField(), required=False)
+    project_ids = serializers.ListField(child=serializers.UUIDField(), required=False)
     class Meta:
         model = Page
@@ -54,10 +48,7 @@ class PageSerializer(BaseSerializer):
             "label_ids",
             "project_ids",
         ]
-        read_only_fields = [
-            "workspace",
-            "owned_by",
-        ]
+        read_only_fields = ["workspace", "owned_by"]
     def create(self, validated_data):
         labels = validated_data.pop("labels", None)
@@ -127,9 +118,7 @@ class PageDetailSerializer(PageSerializer):
     description_html = serializers.CharField()
     class Meta(PageSerializer.Meta):
-        fields = PageSerializer.Meta.fields + [
-            "description_html",
-        ]
+        fields = PageSerializer.Meta.fields + ["description_html"]
 class SubPageSerializer(BaseSerializer):
@@ -138,10 +127,7 @@ class SubPageSerializer(BaseSerializer):
     class Meta:
         model = PageLog
         fields = "__all__"
-        read_only_fields = [
-            "workspace",
-            "page",
-        ]
+        read_only_fields = ["workspace", "page"]
     def get_entity_details(self, obj):
         entity_name = obj.entity_name
@@ -158,10 +144,7 @@ class PageLogSerializer(BaseSerializer):
     class Meta:
         model = PageLog
         fields = "__all__"
-        read_only_fields = [
-            "workspace",
-            "page",
-        ]
+        read_only_fields = ["workspace", "page"]
 class PageVersionSerializer(BaseSerializer):
@@ -178,10 +161,7 @@ class PageVersionSerializer(BaseSerializer):
             "created_by",
             "updated_by",
         ]
-        read_only_fields = [
-            "workspace",
-            "page",
-        ]
+        read_only_fields = ["workspace", "page"]
 class PageVersionDetailSerializer(BaseSerializer):
@@ -201,7 +181,4 @@ class PageVersionDetailSerializer(BaseSerializer):
             "created_by",
             "updated_by",
         ]
-        read_only_fields = [
-            "workspace",
-            "page",
-        ]
+        read_only_fields = ["workspace", "page"]

View File

@@ -4,10 +4,7 @@ from rest_framework import serializers
 # Module imports
 from .base import BaseSerializer, DynamicBaseSerializer
 from plane.app.serializers.workspace import WorkspaceLiteSerializer
-from plane.app.serializers.user import (
-    UserLiteSerializer,
-    UserAdminLiteSerializer,
-)
+from plane.app.serializers.user import UserLiteSerializer, UserAdminLiteSerializer
 from plane.db.models import (
     Project,
     ProjectMember,
@@ -19,31 +16,23 @@ from plane.db.models import (
 class ProjectSerializer(BaseSerializer):
-    workspace_detail = WorkspaceLiteSerializer(
-        source="workspace", read_only=True
-    )
+    workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
+    inbox_view = serializers.BooleanField(read_only=True, source="intake_view")
     class Meta:
         model = Project
         fields = "__all__"
-        read_only_fields = [
-            "workspace",
-            "deleted_at",
-        ]
+        read_only_fields = ["workspace", "deleted_at"]
     def create(self, validated_data):
         identifier = validated_data.get("identifier", "").strip().upper()
         if identifier == "":
-            raise serializers.ValidationError(
-                detail="Project Identifier is required"
-            )
+            raise serializers.ValidationError(detail="Project Identifier is required")
         if ProjectIdentifier.objects.filter(
             name=identifier, workspace_id=self.context["workspace_id"]
         ).exists():
-            raise serializers.ValidationError(
-                detail="Project Identifier is taken"
-            )
+            raise serializers.ValidationError(detail="Project Identifier is taken")
         project = Project.objects.create(
             **validated_data, workspace_id=self.context["workspace_id"]
         )
@@ -82,9 +71,7 @@ class ProjectSerializer(BaseSerializer):
             return project
         # If not same fail update
-        raise serializers.ValidationError(
-            detail="Project Identifier is already taken"
-        )
+        raise serializers.ValidationError(detail="Project Identifier is already taken")
 class ProjectLiteSerializer(BaseSerializer):
@@ -95,6 +82,7 @@ class ProjectLiteSerializer(BaseSerializer):
             "identifier",
             "name",
             "cover_image",
+            "cover_image_url",
             "logo_props",
             "description",
         ]
@@ -117,6 +105,8 @@ class ProjectListSerializer(DynamicBaseSerializer):
     member_role = serializers.IntegerField(read_only=True)
     anchor = serializers.CharField(read_only=True)
     members = serializers.SerializerMethodField()
+    cover_image_url = serializers.CharField(read_only=True)
+    inbox_view = serializers.BooleanField(read_only=True, source="intake_view")
     def get_members(self, obj):
         project_members = getattr(obj, "members_list", None)
@@ -128,6 +118,7 @@ class ProjectListSerializer(DynamicBaseSerializer):
                     "member_id": member.member_id,
                     "member__display_name": member.member.display_name,
                     "member__avatar": member.member.avatar,
+                    "member__avatar_url": member.member.avatar_url,
                 }
                 for member in project_members
             ]
@@ -209,26 +200,16 @@ class ProjectMemberLiteSerializer(BaseSerializer):
 class DeployBoardSerializer(BaseSerializer):
     project_details = ProjectLiteSerializer(read_only=True, source="project")
-    workspace_detail = WorkspaceLiteSerializer(
-        read_only=True, source="workspace"
-    )
+    workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
     class Meta:
         model = DeployBoard
         fields = "__all__"
-        read_only_fields = [
-            "workspace",
-            "project",
-            "anchor",
-        ]
+        read_only_fields = ["workspace", "project", "anchor"]
 class ProjectPublicMemberSerializer(BaseSerializer):
     class Meta:
         model = ProjectPublicMember
         fields = "__all__"
-        read_only_fields = [
-            "workspace",
-            "project",
-            "member",
-        ]
+        read_only_fields = ["workspace", "project", "member"]

View File

@@ -19,19 +19,11 @@ class StateSerializer(BaseSerializer):
             "description",
             "sequence",
         ]
-        read_only_fields = [
-            "workspace",
-            "project",
-        ]
+        read_only_fields = ["workspace", "project"]
 class StateLiteSerializer(BaseSerializer):
     class Meta:
         model = State
-        fields = [
-            "id",
-            "name",
-            "color",
-            "group",
-        ]
+        fields = ["id", "name", "color", "group"]
         read_only_fields = fields

View File

@@ -2,13 +2,7 @@
 from rest_framework import serializers
 # Module import
-from plane.db.models import (
-    Account,
-    Profile,
-    User,
-    Workspace,
-    WorkspaceMemberInvite,
-)
+from plane.db.models import Account, Profile, User, Workspace, WorkspaceMemberInvite
 from .base import BaseSerializer
@@ -17,11 +11,7 @@ class UserSerializer(BaseSerializer):
     class Meta:
         model = User
         # Exclude password field from the serializer
-        fields = [
-            field.name
-            for field in User._meta.fields
-            if field.name != "password"
-        ]
+        fields = [field.name for field in User._meta.fields if field.name != "password"]
         # Make all system fields and email read only
         read_only_fields = [
             "id",
@@ -62,6 +52,8 @@ class UserMeSerializer(BaseSerializer):
             "id",
             "avatar",
             "cover_image",
+            "avatar_url",
+            "cover_image_url",
             "date_joined",
             "display_name",
             "email",
@@ -84,11 +76,7 @@ class UserMeSettingsSerializer(BaseSerializer):
     class Meta:
         model = User
-        fields = [
-            "id",
-            "email",
-            "workspace",
-        ]
+        fields = ["id", "email", "workspace"]
         read_only_fields = fields
     def get_workspace(self, obj):
@@ -125,8 +113,7 @@ class UserMeSettingsSerializer(BaseSerializer):
         else:
             fallback_workspace = (
                 Workspace.objects.filter(
-                    workspace_member__member_id=obj.id,
-                    workspace_member__is_active=True,
+                    workspace_member__member_id=obj.id, workspace_member__is_active=True
                 )
                 .order_by("created_at")
                 .first()
@@ -135,14 +122,10 @@ class UserMeSettingsSerializer(BaseSerializer):
                 "last_workspace_id": None,
                 "last_workspace_slug": None,
                 "fallback_workspace_id": (
-                    fallback_workspace.id
-                    if fallback_workspace is not None
-                    else None
+                    fallback_workspace.id if fallback_workspace is not None else None
                 ),
                 "fallback_workspace_slug": (
-                    fallback_workspace.slug
-                    if fallback_workspace is not None
-                    else None
+                    fallback_workspace.slug if fallback_workspace is not None else None
                 ),
                 "invites": workspace_invites,
             }
@@ -156,13 +139,11 @@ class UserLiteSerializer(BaseSerializer):
             "first_name",
             "last_name",
             "avatar",
+            "avatar_url",
             "is_bot",
             "display_name",
         ]
-        read_only_fields = [
-            "id",
-            "is_bot",
-        ]
+        read_only_fields = ["id", "is_bot"]
 class UserAdminLiteSerializer(BaseSerializer):
@@ -173,15 +154,13 @@ class UserAdminLiteSerializer(BaseSerializer):
             "first_name",
             "last_name",
             "avatar",
+            "avatar_url",
             "is_bot",
             "display_name",
             "email",
             "last_login_medium",
         ]
-        read_only_fields = [
-            "id",
-            "is_bot",
-        ]
+        read_only_fields = ["id", "is_bot"]
 class ChangePasswordSerializer(serializers.Serializer):
@@ -202,9 +181,7 @@ class ChangePasswordSerializer(serializers.Serializer):
         if data.get("new_password") != data.get("confirm_password"):
             raise serializers.ValidationError(
-                {
-                    "error": "Confirm password should be same as the new password."
-                }
+                {"error": "Confirm password should be same as the new password."}
             )
         return data
@@ -222,15 +199,11 @@ class ProfileSerializer(BaseSerializer):
     class Meta:
         model = Profile
         fields = "__all__"
-        read_only_fields = [
-            "user",
-        ]
+        read_only_fields = ["user"]
 class AccountSerializer(BaseSerializer):
     class Meta:
         model = Account
         fields = "__all__"
-        read_only_fields = [
-            "user",
-        ]
+        read_only_fields = ["user"]

View File

@@ -47,13 +47,9 @@ class WebhookSerializer(DynamicBaseSerializer):
         # Additional validation for multiple request domains and their subdomains
         request = self.context.get("request")
-        disallowed_domains = [
-            "plane.so",
-        ]  # Add your disallowed domains here
+        disallowed_domains = ["plane.so"]  # Add your disallowed domains here
         if request:
-            request_host = request.get_host().split(":")[
-                0
-            ]  # Remove port if present
+            request_host = request.get_host().split(":")[0]  # Remove port if present
             disallowed_domains.append(request_host)
         # Check if hostname is a subdomain or exact match of any disallowed domain
@@ -99,9 +95,7 @@ class WebhookSerializer(DynamicBaseSerializer):
         # Additional validation for multiple request domains and their subdomains
         request = self.context.get("request")
-        disallowed_domains = [
-            "plane.so",
-        ]  # Add your disallowed domains here
+        disallowed_domains = ["plane.so"]  # Add your disallowed domains here
         if request:
             request_host = request.get_host().split(":")[
                 0
@@ -122,10 +116,7 @@ class WebhookSerializer(DynamicBaseSerializer):
     class Meta:
         model = Webhook
         fields = "__all__"
-        read_only_fields = [
-            "workspace",
-            "secret_key",
-        ]
+        read_only_fields = ["workspace", "secret_key"]
 class WebhookLogSerializer(DynamicBaseSerializer):

View File

@@ -6,11 +6,8 @@ from .base import BaseSerializer, DynamicBaseSerializer
 from .user import UserLiteSerializer, UserAdminLiteSerializer
 from plane.db.models import (
-    User,
     Workspace,
     WorkspaceMember,
-    Team,
-    TeamMember,
     WorkspaceMemberInvite,
     WorkspaceTheme,
     WorkspaceUserProperties,
@@ -22,6 +19,7 @@ class WorkSpaceSerializer(DynamicBaseSerializer):
     owner = UserLiteSerializer(read_only=True)
     total_members = serializers.IntegerField(read_only=True)
     total_issues = serializers.IntegerField(read_only=True)
+    logo_url = serializers.CharField(read_only=True)
     def validate_slug(self, value):
         # Check if the slug is restricted
@@ -39,17 +37,14 @@ class WorkSpaceSerializer(DynamicBaseSerializer):
             "created_at",
             "updated_at",
             "owner",
+            "logo_url",
         ]
 class WorkspaceLiteSerializer(BaseSerializer):
     class Meta:
         model = Workspace
-        fields = [
-            "name",
-            "slug",
-            "id",
-        ]
+        fields = ["name", "slug", "id"]
         read_only_fields = fields
@@ -63,6 +58,8 @@ class WorkSpaceMemberSerializer(DynamicBaseSerializer):
 class WorkspaceMemberMeSerializer(BaseSerializer):
+    draft_issue_count = serializers.IntegerField(read_only=True)
     class Meta:
         model = WorkspaceMember
         fields = "__all__"
@@ -97,71 +94,15 @@ class WorkSpaceMemberInviteSerializer(BaseSerializer):
         ]
-class TeamSerializer(BaseSerializer):
-    members_detail = UserLiteSerializer(
-        read_only=True, source="members", many=True
-    )
-    members = serializers.ListField(
-        child=serializers.PrimaryKeyRelatedField(queryset=User.objects.all()),
-        write_only=True,
-        required=False,
-    )
-    class Meta:
-        model = Team
-        fields = "__all__"
-        read_only_fields = [
-            "workspace",
-            "created_by",
-            "updated_by",
-            "created_at",
-            "updated_at",
-        ]
-    def create(self, validated_data, **kwargs):
-        if "members" in validated_data:
-            members = validated_data.pop("members")
-            workspace = self.context["workspace"]
-            team = Team.objects.create(**validated_data, workspace=workspace)
-            team_members = [
-                TeamMember(member=member, team=team, workspace=workspace)
-                for member in members
-            ]
-            TeamMember.objects.bulk_create(team_members, batch_size=10)
-            return team
-        team = Team.objects.create(**validated_data)
-        return team
-    def update(self, instance, validated_data):
-        if "members" in validated_data:
-            members = validated_data.pop("members")
-            TeamMember.objects.filter(team=instance).delete()
-            team_members = [
-                TeamMember(
-                    member=member, team=instance, workspace=instance.workspace
-                )
-                for member in members
-            ]
-            TeamMember.objects.bulk_create(team_members, batch_size=10)
-            return super().update(instance, validated_data)
-        return super().update(instance, validated_data)
 class WorkspaceThemeSerializer(BaseSerializer):
     class Meta:
         model = WorkspaceTheme
         fields = "__all__"
-        read_only_fields = [
-            "workspace",
-            "actor",
-        ]
+        read_only_fields = ["workspace", "actor"]
 class WorkspaceUserPropertiesSerializer(BaseSerializer):
     class Meta:
         model = WorkspaceUserProperties
         fields = "__all__"
-        read_only_fields = [
-            "workspace",
-            "user",
-        ]
+        read_only_fields = ["workspace", "user"]

View File

@@ -5,7 +5,7 @@ from .cycle import urlpatterns as cycle_urls
 from .dashboard import urlpatterns as dashboard_urls
 from .estimate import urlpatterns as estimate_urls
 from .external import urlpatterns as external_urls
-from .inbox import urlpatterns as inbox_urls
+from .intake import urlpatterns as intake_urls
 from .issue import urlpatterns as issue_urls
 from .module import urlpatterns as module_urls
 from .notification import urlpatterns as notification_urls
@@ -25,7 +25,7 @@ urlpatterns = [
     *dashboard_urls,
     *estimate_urls,
     *external_urls,
-    *inbox_urls,
+    *intake_urls,
     *issue_urls,
     *module_urls,
     *notification_urls,

View File

@@ -5,6 +5,13 @@ from plane.app.views import (
     FileAssetEndpoint,
     UserAssetsEndpoint,
     FileAssetViewSet,
+    # V2 Endpoints
+    WorkspaceFileAssetEndpoint,
+    UserAssetsV2Endpoint,
+    StaticFileAssetEndpoint,
+    AssetRestoreEndpoint,
+    ProjectAssetEndpoint,
+    ProjectBulkAssetEndpoint,
 )
@@ -19,11 +26,7 @@ urlpatterns = [
         FileAssetEndpoint.as_view(),
         name="file-assets",
     ),
-    path(
-        "users/file-assets/",
-        UserAssetsEndpoint.as_view(),
-        name="user-file-assets",
-    ),
+    path("users/file-assets/", UserAssetsEndpoint.as_view(), name="user-file-assets"),
     path(
         "users/file-assets/<str:asset_key>/",
         UserAssetsEndpoint.as_view(),
@@ -31,11 +34,52 @@ urlpatterns = [
     ),
     path(
         "workspaces/file-assets/<uuid:workspace_id>/<str:asset_key>/restore/",
-        FileAssetViewSet.as_view(
-            {
-                "post": "restore",
-            }
-        ),
+        FileAssetViewSet.as_view({"post": "restore"}),
         name="file-assets-restore",
     ),
+    # V2 Endpoints
+    path(
+        "assets/v2/workspaces/<str:slug>/",
+        WorkspaceFileAssetEndpoint.as_view(),
+        name="workspace-file-assets",
+    ),
+    path(
+        "assets/v2/workspaces/<str:slug>/<uuid:asset_id>/",
+        WorkspaceFileAssetEndpoint.as_view(),
+        name="workspace-file-assets",
+    ),
+    path(
+        "assets/v2/user-assets/",
+        UserAssetsV2Endpoint.as_view(),
+        name="user-file-assets",
+    ),
+    path(
+        "assets/v2/user-assets/<uuid:asset_id>/",
+        UserAssetsV2Endpoint.as_view(),
+        name="user-file-assets",
+    ),
+    path(
+        "assets/v2/workspaces/<str:slug>/restore/<uuid:asset_id>/",
+        AssetRestoreEndpoint.as_view(),
+        name="asset-restore",
+    ),
+    path(
+        "assets/v2/static/<uuid:asset_id>/",
+        StaticFileAssetEndpoint.as_view(),
+        name="static-file-asset",
+    ),
+    path(
+        "assets/v2/workspaces/<str:slug>/projects/<uuid:project_id>/",
+        ProjectAssetEndpoint.as_view(),
+        name="bulk-asset-update",
+    ),
+    path(
+        "assets/v2/workspaces/<str:slug>/projects/<uuid:project_id>/<uuid:pk>/",
+        ProjectAssetEndpoint.as_view(),
+        name="bulk-asset-update",
+    ),
+    path(
+        "assets/v2/workspaces/<str:slug>/projects/<uuid:project_id>/<uuid:entity_id>/bulk/",
+        ProjectBulkAssetEndpoint.as_view(),
+    ),
 ]

View File

@@ -17,12 +17,7 @@ from plane.app.views import (
 urlpatterns = [
     path(
         "workspaces/<str:slug>/projects/<uuid:project_id>/cycles/",
-        CycleViewSet.as_view(
-            {
-                "get": "list",
-                "post": "create",
-            }
-        ),
+        CycleViewSet.as_view({"get": "list", "post": "create"}),
         name="project-cycle",
     ),
     path(
@@ -39,12 +34,7 @@ urlpatterns = [
     ),
     path(
         "workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/cycle-issues/",
-        CycleIssueViewSet.as_view(
-            {
-                "get": "list",
-                "post": "create",
-            }
-        ),
+        CycleIssueViewSet.as_view({"get": "list", "post": "create"}),
         name="project-issue-cycle",
     ),
     path(
@@ -66,21 +56,12 @@ urlpatterns = [
     ),
     path(
         "workspaces/<str:slug>/projects/<uuid:project_id>/user-favorite-cycles/",
-        CycleFavoriteViewSet.as_view(
-            {
-                "get": "list",
-                "post": "create",
-            }
-        ),
+        CycleFavoriteViewSet.as_view({"get": "list", "post": "create"}),
        name="user-favorite-cycle",
     ),
     path(
         "workspaces/<str:slug>/projects/<uuid:project_id>/user-favorite-cycles/<uuid:cycle_id>/",
-        CycleFavoriteViewSet.as_view(
-            {
-                "delete": "destroy",
-            }
-        ),
+        CycleFavoriteViewSet.as_view({"delete": "destroy"}),
         name="user-favorite-cycle",
     ),
     path(

Some files were not shown because too many files have changed in this diff.