Compare commits

...

65 Commits

Author SHA1 Message Date
Palanikannan1437
3865bbdd30 feat: Add support for hardBreak extension
This commit adds support for the hardBreak extension in the `CoreEditorExtensions` component located at `packages/editor/core/src/ui/extensions/index.tsx`. The extension now includes an HTMLAttributes object with a class of "p-2" for styling purposes.
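A minimal sketch of that configuration, assuming the standard TipTap HardBreak API (the actual Plane extension list is longer than shown here):

// hedged sketch, not the actual Plane source
import HardBreak from "@tiptap/extension-hard-break";

// The HTMLAttributes object mentioned in the commit puts the "p-2" class
// on every hard break the extension renders.
export const CustomHardBreak = HardBreak.configure({
  HTMLAttributes: { class: "p-2" },
});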
2024-01-19 16:34:28 +05:30
sriram veeraghanta
e54b940f86 Merge branch 'preview' of github.com:makeplane/plane into develop 2024-01-19 16:17:21 +05:30
Prateek Shourya
e4084fe156 refactor: move all sidebar links to constant file. (#3414)
* chore: move all workspace, project and profile link constants into their own constants file.

* chore: sidebar menu links improvement.
2024-01-19 16:05:52 +05:30
Henit Chobisa
e975abff21 Improvement: High Performance MobX Integration for Pages ✈︎ (#3397)
* fix: removed parameters `workspace`, `project` & `id` from the patch calls

* feat: modified components to work with new pages hooks

* feat: modified stores

* feat: modified initial component

* feat: component implementation changes

* feat: store implementation

* refactor pages store

* feat: updated page store to perform async operations faster

* fix: added types for archive and restore pages

* feat: implemented archive and restore pages

* fix: page being created twice on form submit

* feat: updated create-page-modal

* feat: updated page form and delete page modal

* fix: create page modal not updating isSubmitted prop

* feat: list items and list view refactored for pages

* feat: refactored project-page-store for inserting computed page ids

* chore: renamed project pages hook

* feat: added favourite pages implementation

* fix: implemented store for archived pages

* fix: project page store for recent pages

* fix: issue suggestions breaking pages

* fix: issue embeds and suggestions breaking

* feat: implemented page store and project page store in page editor

* chore: lock file changes

* fix: modified page details header to catch mobx updates instead of swr calls

* fix: modified usePage hook to fetch page details when reloaded directly on page

* fix: fixed deleting pages

* fix: removed re-render on props change

* feat: implemented page store inside page details

* fix: role change in pages archives

* fix: re-rendering of pages on tab change

* fix: reimplementation of peek overview inside pages

* chore: typo fixes

* fix: issue suggestion widget selecting wrong issues on click

* feat: added labels in pages

* fix: deepsource errors fixed

* fix: build errors

* fix: review comments

* fix: removed swr hooks from the `usePage` store hook and refactored `issueEmbed` hook

* fix: resolved review comments

---------

Co-authored-by: Rahul R <rahulr@Rahuls-MacBook-Pro.local>
2024-01-19 15:18:47 +05:30
Bavisetti Narayan
f68e6023c3 fix: user profile issues (#3409) 2024-01-19 15:17:49 +05:30
rahulramesha
67414983da fix: update global issues filter and enable profile issues (#3410)
* fix all issues and fix profile issues

* minor comments update

* minor change to nullish check logic

* update nullish check logic

---------

Co-authored-by: Rahul R <rahulr@Rahuls-MacBook-Pro.local>
2024-01-19 15:06:19 +05:30
Anmol Singh Bhatia
37ddc64b83 chore: issue filter loader improvement (#3406) 2024-01-18 17:26:13 +05:30
sriram veeraghanta
57c25c9a5a Merge branch 'preview' of github.com:makeplane/plane into develop 2024-01-18 16:05:31 +05:30
rahulramesha
c593d5df1b fix: enable global/ all issues (#3405)
* fix global issues and views

* remove separate layouts for specific views

* add permissions to views

* fix global issues filters

---------

Co-authored-by: Rahul R <rahulr@Rahuls-MacBook-Pro.local>
2024-01-18 15:51:17 +05:30
Bavisetti Narayan
9065b5d368 feat: dashboard widgets (#3362)
* fix: created dashboard, widgets and dashboard widget model

* fix: new user home dashboard

* chore: recent projects list

* chore: recent collaborators

* chore: priority order change

* chore: payload changes

* chore: collaborator's active issue count

* chore: all dashboard widgets added with services and types

* chore: centered metric for pie chart

* chore: widget filters

* chore: created issue filter

* fix: created and assigned issues payload change

* chore: created issue payload change

* fix: date filter change

* chore: implement filters

* fix: added expansion fields

* fix: changed issue structure with relation

* chore: new issues response

* fix: project member fix

* chore: updated issue_relation structure

* chore: code cleanup

* chore: update issues response and added empty states

* fix: button text wrap

* chore: update empty state messages

* fix: filters

* chore: update dark mode empty states

* build-error: Type check in the issue relation service

* fix: issues redirection

* fix: project empty state

* chore: project member active check

* chore: project member check in state and priority

* chore: remove console logs and replace hardcoded values with constants

* fix: code refactoring

* fix: key name changed

* refactor: mapping through similar components using an array

* fix: build errors

---------

Co-authored-by: Aaryan Khandelwal <aaryankhandu123@gmail.com>
Co-authored-by: gurusainath <gurusainath007@gmail.com>
2024-01-18 15:49:54 +05:30
Prateek Shourya
a9e2e21641 refactor: update create/update issue modal to use currently active store's create/update method. (#3395)
* refactor: update `create/update issue` modal to use currently active store's create/update method.

* chore: add condition to avoid multiple API calls if the current store is MODULE or CYCLE.

* remove: console log

* chore: update `currentStore` to `storeType`.
2024-01-18 14:42:10 +05:30
sriram veeraghanta
e175d50ab7 Merge branch 'preview' of github.com:makeplane/plane into develop 2024-01-18 13:47:16 +05:30
M. Palanikannan
75b8e3350a 🐛 fix: Hide drag handle when cursor leaves the editor container (#3401)
This fix adds support for hiding the drag handle when the cursor leaves the editor container. It improves the user experience by providing a cleaner interface and removing unnecessary visual elements especially while scrolling.

- Add `hideDragHandle` prop to `EditorContainer` component in `editor-container.tsx`.
- Implement `onMouseLeave` event handler in `EditorContainer` to invoke `hideDragHandle` function.
- Update `DragAndDrop` extension in `drag-drop.tsx` to accept a `setHideDragHandle` function as an optional parameter.
- Pass the `setHideDragHandle` function from `RichTextEditor` component to `DragAndDrop` extension in `RichTextEditorExtensions` function in `index.tsx`.
- Set `hideDragHandleOnMouseLeave` state in `RichTextEditor` component to store the `hideDragHandlerFromDragDrop` function.
- Create `setHideDragHandleFunction` callback function in `RichTextEditor` to update the `hideDragHandleOnMouseLeave` state.
- Pass `hideDragHandleOnMouseLeave` as `hideDragHandle` prop to `EditorContainer` component in `RichTextEditor`.
2024-01-18 12:43:43 +05:30
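A hedged sketch of the wiring described above; the prop names come from the commit message, and the component shapes are assumptions:

// editor-container sketch (assumed props, not the actual Plane source)
import * as React from "react";

type EditorContainerProps = {
  hideDragHandle?: () => void; // passed down from RichTextEditor
  children: React.ReactNode;
};

export const EditorContainer = ({ hideDragHandle, children }: EditorContainerProps) => (
  // Invoke the hide callback whenever the cursor leaves the editor area.
  <div id="editor-container" onMouseLeave={() => hideDragHandle?.()}>
    {children}
  </div>
);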
Prateek Shourya
6e1cd4194a fix: stack integration disable button mutation issue in project settings. (#3402) 2024-01-18 12:31:10 +05:30
Anmol Singh Bhatia
615ccf9459 chore: workspace active cycles improvement (#3396) 2024-01-17 23:04:53 +05:30
sriram veeraghanta
13362590b6 fix: resolved merge conflicts while moving changes from preview to develop 2024-01-17 19:13:47 +05:30
Prateek Shourya
7833ca7bea fix: project views bugs related to store refactor. (#3391)
* chore: remove debounce logic to fix create/ update view modal bugs.

* fix: bug in delete views not mutating the store.

* chore: replace `Project Empty State` with `Project Views Empty State`.

* chore: add issue peek overview.

* refactor: issue update, delete actions for project views layout.
fix: issue update and delete actions throwing errors.
fix: issue quick add throwing an error.
2024-01-17 18:37:46 +05:30
M. Palanikannan
a1d27a1bf0 [chore]: Removed explicit dependencies and cleaned up turbo config (#3388)
* Removed explicit dependencies and cleaned up turbo config

* fix: upgrade turbo

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2024-01-17 17:24:56 +05:30
rahulramesha
8fbd4a059b fix: refactor related bugs (#3384)
* fix sub issues inside issue detail

* close peek overview after opening issue detail

* fix error while opening peek overview

* fix saving project views

---------

Co-authored-by: Rahul R <rahulr@Rahuls-MacBook-Pro.local>
2024-01-16 21:16:12 +05:30
Anmol Singh Bhatia
e751686683 chore: filter and display properties improvement (#3382) 2024-01-16 20:57:55 +05:30
Anmol Singh Bhatia
8ee5ba96ce dev: workspace active cycles (#3378)
* chore: workspace active cycles

* fix: active cycles tab implementation

* chore: added distribution graph for active cycles

* chore: removed distribution graph and issues

* Revert "chore: removed issues"

This reverts commit 7d977ac8b0.

* chore: workspace active cycles implementation

* chore: code refactor

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2024-01-16 19:54:32 +05:30
Anmol Singh Bhatia
9e8885df5f chore: esc to close peek overview added (#3380) 2024-01-16 18:23:42 +05:30
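A minimal sketch of an Escape-to-close handler of the kind this commit adds (hook name and placement are assumptions):

// use-esc-close sketch (hypothetical hook name)
import { useEffect } from "react";

export const useEscClose = (onClose: () => void) => {
  useEffect(() => {
    const handler = (event: KeyboardEvent) => {
      if (event.key === "Escape") onClose(); // close the peek overview
    };
    window.addEventListener("keydown", handler);
    return () => window.removeEventListener("keydown", handler);
  }, [onClose]);
};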
Anmol Singh Bhatia
bc48010377 fix: drag and delete issue (#3379) 2024-01-16 18:22:53 +05:30
Lakhan Baheti
9fde539b1d chore: webhook create page removed (#3376)
* chore: webhook create page removed

* fix: removed unused variables
2024-01-16 14:22:48 +05:30
guru_sainath
ec26bf6e68 chore: update in sub-issues component and property validation and issue loaders (#3375)
* fix: handled undefined issue_id in list layout

* chore: refactor peek overview and user role validation.

* chore: sub issues

* fix: sub issues state distribution changed

* chore: sub_issues implementation in issue detail page

* chore: fixes in cycle/ module layout.
* Fix progress chart
* Module issues's update/ delete.
* Peek Overview for Modules/ Cycle.
* Fix Cycle Filters not applying bug.

---------

Co-authored-by: Prateek Shourya <prateekshourya29@gmail.com>
Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-01-16 12:46:03 +05:30
sriram veeraghanta
e9ef3fb32a chore: formatting all python files using black formatter (#3366) 2024-01-13 19:05:06 +05:30
rahulramesha
ee2c7c5fa1 enable peekoverview for spreadsheet and minor refactor for faster opening of the peekoverview component (#3361)
Co-authored-by: Rahul R <rahulr@Rahuls-MacBook-Pro.local>
2024-01-12 13:52:04 +05:30
rahulramesha
d64ae9a2e4 fix: project loaders for mobx store (#3356)
* add loaders to all the dropdowns outside project wrapper

* fix build errors

* minor refactor for project states color

---------

Co-authored-by: Rahul R <rahulr@Rahuls-MacBook-Pro.local>
2024-01-12 13:51:00 +05:30
Henit Chobisa
f58a00a4ab [FIX] Pages Malfunctioning on Load and Recent Pages Computation (#3359)
* fix: fixed `usePage` hook returning context instead of IPageStore

* fix: updated recent pages with `updated_at` instead of `created_at`

* fix: throw an error instead of returning an empty array
2024-01-12 13:26:48 +05:30
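A hedged sketch of the first fix: the hook should resolve and return the page's store object rather than the raw context (store shape and module names are assumptions):

// use-page sketch (hypothetical store shape)
import { useContext } from "react";
import { StoreContext } from "./store-context"; // hypothetical context module

export const usePage = (pageId: string) => {
  const context = useContext(StoreContext);
  if (!context) throw new Error("usePage must be used within StoreProvider");
  // Return the IPageStore for this page, not the context object itself.
  return context.projectPages.pageMap[pageId];
};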
sriram veeraghanta
a3e5284f71 Merge branch 'preview' of github.com:makeplane/plane into develop 2024-01-12 12:25:57 +05:30
sriram veeraghanta
1c06c3f43e fix: create more toggle fixes in create issue modal (#3355)
* fix: create more issue bugfixes

* fix: removing all warnings
2024-01-11 21:01:05 +05:30
sriram veeraghanta
da1496fe65 fix: create sync action (#3353)
* fix: create sync action changes

* fix: typo changes
2024-01-11 18:40:26 +05:30
M. Palanikannan
3d489e186f fix: inline code blocks, code blocks and links have saner behaviour (#3318)
* fix: removed backticks in inline code blocks

* added better error handling while cancelling uploads

* fix: inline code blocks, code blocks and links have saner behaviour

- Inline code blocks are now exitable, don't have backticks, have better padding vertically and better regex matching
- Code blocks at the top and bottom of the document are now exitable via Up and Down arrow keys
- Links are now exitable while being autolinkable via a custom re-write of the tiptap-link-extension

* fix: more robust link checking
2024-01-11 18:29:41 +05:30
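A sketch of the arrow-key exit behavior, under the assumption that it extends TipTap's CodeBlock keyboard shortcuts (the actual implementation may differ):

// exitable code block sketch (assumed approach)
import CodeBlock from "@tiptap/extension-code-block";

export const ExitableCodeBlock = CodeBlock.extend({
  addKeyboardShortcuts() {
    return {
      ...this.parent?.(),
      // Down at the end of a trailing code block exits into a new paragraph.
      ArrowDown: ({ editor }) => {
        const { $from } = editor.state.selection;
        const atBlockEnd = $from.parentOffset === $from.parent.content.size;
        if (editor.isActive("codeBlock") && atBlockEnd) {
          return editor.commands.exitCode();
        }
        return false; // fall through to default handling
      },
    };
  },
});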
guru_sainath
57d5ff7646 chore: Error Handling and Validation Updates (#3351)
* fix: handled undefined issue_id in list layout

* chore: updated label select dropdown in the issue detail

* fix: peekoverview issue is resolved

* chore: user role validation for issue details.

* fix: Link, Attachment, parent mutation

* build-error: build error resolved in peekoverview

* chore: user role validation for issue details.

* chore: user role validation for `issue description`, `parent`, `relation` and `subscription`.

* chore: issue subscription mutation

* chore: user role validation for `labels` in issue details.

---------

Co-authored-by: Prateek Shourya <prateekshourya29@gmail.com>
2024-01-11 18:26:58 +05:30
rahulramesha
3c9926d383 update swr config to not fetch everything on focus (#3350)
Co-authored-by: Rahul R <rahulr@Rahuls-MacBook-Pro.local>
2024-01-11 18:21:41 +05:30
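The change amounts to flipping SWR's global revalidation flag; a minimal sketch (the exact option set Plane uses is not shown in this diff):

// swr provider sketch
import * as React from "react";
import { SWRConfig } from "swr";

export const AppSWRProvider = ({ children }: { children: React.ReactNode }) => (
  // Stop every mounted hook from refetching when the window regains focus.
  <SWRConfig value={{ revalidateOnFocus: false }}>{children}</SWRConfig>
);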
rahulramesha
ece4d5b1ed chore: Refactor Spreadsheet view for better code maintainability and performance (#3322)
* refactor spreadsheet to use a table- and row-based approach rather than a column-based one (see the sketch after this entry)

* update spreadsheet and optimized layout

* fix issues in spreadsheet

* close quick action menu on click

---------

Co-authored-by: Rahul R <rahulr@Rahuls-MacBook-Pro.local>
2024-01-11 18:19:19 +05:30
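A sketch of the row-based structure referenced above: one memoized <tr> per issue, so updating a single issue re-renders one row instead of every column list (types are assumptions):

// row-based spreadsheet sketch
import * as React from "react";

type Issue = { id: string; name: string; priority: string };

// Memoized row: edits to one issue re-render only its <tr>.
const SpreadsheetRow = React.memo(({ issue }: { issue: Issue }) => (
  <tr>
    <td>{issue.name}</td>
    <td>{issue.priority}</td>
  </tr>
));
SpreadsheetRow.displayName = "SpreadsheetRow";

export const SpreadsheetTable = ({ issues }: { issues: Issue[] }) => (
  <table>
    <tbody>
      {issues.map((issue) => (
        <SpreadsheetRow key={issue.id} issue={issue} />
      ))}
    </tbody>
  </table>
);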
guru_sainath
73eed69aa6 chore: refactored and resolved build issues on the issues and issue detail page (#3340)
* fix: handled undefined issue_id in list layout

* dev: issue detail store and optimization

* dev: issue filter and list operations

* fix: typo on labels update

* dev: Handled all issues in the list layout in project issues

* dev: handled kanban and quick add issue in swimlanes

* chore: fixed peekoverview in kanban

* chore: fixed peekoverview in calendar

* chore: fixed peekoverview in gantt

* chore: updated quick add in the gantt chart

* chore: handled issue detail properties and resolved build issues

---------

Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
2024-01-10 20:09:45 +05:30
Henit Chobisa
09603cf189 fix: link preview editor (#3335)
* feat: added link preview plugin in document editor

* fix: readonly editor page renderer css

* fix: autolink issue with links

* chore: added floating UI

* feat: added link preview components

* feat: added floating UI to page renderer for link previews

* feat: added actionCompleteHandler to page renderer

* chore: Lock file changes

* fix: regex security error

* chore: updated radix with lucide icons

---------

Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
2024-01-10 18:18:09 +05:30
Nikhil
23e53df3ad dev: fix smtp configuration (#3339) 2024-01-10 18:16:28 +05:30
Nikhil
57594aac4e dev: update the instance urls (#3329) 2024-01-10 12:22:20 +05:30
Anmol Singh Bhatia
8b884ab681 chore: modal and dropdown improvement (#3332)
* dev: dropdown key-down custom hook added

* chore: plane ui dropdowns updated

* chore: cycle and module tab index added in modals

* chore: view and page tab index added in modals

* chore: issue modal tab indexing added

* chore: project modal tab indexing added

* fix: build fix

* build-error: build error in new pages structure; reverted to old page structure

---------

Co-authored-by: gurusainath <gurusainath007@gmail.com>
2024-01-10 12:21:24 +05:30
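A hedged sketch of a key-down hook like the one this PR adds for modal tab indexing (name and signature are assumptions):

// use-dropdown-key-down sketch (hypothetical signature)
import { useCallback, type KeyboardEvent } from "react";

export const useDropdownKeyDown = (onOpen: () => void, onClose: () => void) =>
  useCallback(
    (event: KeyboardEvent<HTMLElement>) => {
      if (event.key === "Enter") onOpen(); // open via keyboard
      if (event.key === "Escape") onClose();
    },
    [onOpen, onClose]
  );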
sriram veeraghanta
08e5f2b156 Merge branch 'preview' of github.com:makeplane/plane into develop 2024-01-09 22:51:51 +05:30
sriram veeraghanta
cb3a73e515 Merge branch 'preview' of github.com:makeplane/plane into develop 2024-01-09 20:41:49 +05:30
sriram veeraghanta
cb2a7d0930 Merge branch 'preview' of github.com:makeplane/plane into develop 2024-01-08 23:29:36 +05:30
sriram veeraghanta
c38e048ce8 Merge branches 'fix/pages-store' and 'develop' of github.com:makeplane/plane into develop 2024-01-08 23:28:12 +05:30
Bavisetti Narayan
94b72effbf chore: mobile configs (#3328)
* chore: mobile configs

* chore: mobile configurations changed

* chore: removed the slack id

* chore: reversed google client id
2024-01-08 23:25:14 +05:30
rahulramesha
eccb1f5d10 fix: breaking cycle issues and replacing router.push with Links (#3330)
* fix cycle creation and active cycle map

* minor fix in cycle store

* create cycle breaking fix

* replace last possible bits of router.push with Link

---------

Co-authored-by: Rahul R <rahulr@Rahuls-MacBook-Pro.local>
2024-01-08 19:20:42 +05:30
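The swap this PR finishes, sketched with an assumed route: an anchor from next/link restores prefetching and open-in-new-tab behavior that an onClick + router.push button never had:

// router.push → Link sketch (route path is an assumption)
import Link from "next/link";

// Before: <button onClick={() => router.push(`/${workspaceSlug}/cycles/${cycleId}`)}>
// After: a real anchor element.
export const CycleLink = ({ workspaceSlug, cycleId }: { workspaceSlug: string; cycleId: string }) => (
  <Link href={`/${workspaceSlug}/cycles/${cycleId}`}>Open cycle</Link>
);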
Prateek Shourya
a71491ecb9 fix: estimate order not maintained in create/ update modal. (#3326)
* fix: estimate order not maintained in create/ update modal.

* fix: estimate points mutation on update.
2024-01-08 16:16:45 +05:30
sriram veeraghanta
455c2cc787 fix: pages store structure changes 2024-01-07 12:05:52 +05:30
Anmol Singh Bhatia
81f6557908 fix: workspace invitations response updated (#3321) 2024-01-05 23:42:52 +05:30
Anmol Singh Bhatia
2f10f35191 chore: bug fixes and improvement (#3303)
* refactor: updated preloaded function for the list view quick add

* fix: resolved bug in the assignee dropdown

* chore: issue sidebar link improvement

* fix: resolved subscription store bug

* chore: updated preloaded function for the kanban layout quick add

* chore: resolved issues in the list filters and component

* chore: filter store updated

* fix: issue serializer changed

* chore: quick add preload function updated

* fix: build error

* fix: serializer changed

* fix: minor request change

* chore: resolved build issues and updated the prepopulated data in the quick add issue.

* fix: build fix and code refactor

* fix: spreadsheet layout quick add fix

* fix: issue peek overview link section updated

* fix: cycle status bug fix

* fix: serializer changes

* fix: assignee and labels listing

* chore: issue modal parent_id default value updated

* fix: cycle and module issue serializer change

* fix: cycle list serializer changed

* chore: prepopulated validation in both list and kanban for quick add and group header add issues

* chore: group header validation added

* fix: issue response payload change

* dev: make cycle and module issue create response similar

* chore: custom control link component added (see the sketch after this entry)

* dev: make issue create and update response similar to list and retrieve

* fix: build error

* chore: control link component improvement

* chore: globalise issue peek overview

* chore: control link component improvement

* chore: made changes and optimised the issue peek overview root

* build-error: resolved build errors for issueId dependency from issue detail store

* chore: peek overview link fix

* dev: update state nullable rule

---------

Co-authored-by: gurusainath <gurusainath007@gmail.com>
Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
2024-01-05 23:37:13 +05:30
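A hedged sketch of a "control link" component as flagged in the list above: plain clicks route client-side, while ctrl/cmd-clicks keep native anchor behavior (props are assumptions):

// control-link sketch (assumed props)
import * as React from "react";

type ControlLinkProps = {
  href: string;
  onClick: () => void; // client-side navigation callback
  children: React.ReactNode;
};

export const ControlLink = ({ href, onClick, children }: ControlLinkProps) => {
  const handleClick = (event: React.MouseEvent<HTMLAnchorElement>) => {
    if (event.ctrlKey || event.metaKey) return; // let the browser open a new tab
    event.preventDefault();
    onClick();
  };
  return (
    <a href={href} onClick={handleClick}>
      {children}
    </a>
  );
};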
Prateek Shourya
cf64c7bbc6 fix: project identifier cursor behaviour in create project modal. (#3320) 2024-01-05 14:38:09 +05:30
Prateek Shourya
9dd8c8ba14 chore: UI/UX improvements (#3319)
* chore: add proper message for a cycle/ module that has start & end dates but isn't active yet.

* fix: infinite loader after updating workspace settings.

* fix: user profile icon dropdown doesn't close automatically.

* style: fix inconsistent padding in cycle empty state.

* chore: remove multiple `empty state` in labels settings and improve add label logic.

* style: fix inconsistent padding in project label, integration and estimates empty state.

* style: fix integrations settings breadcrumb title.

* style: add proper `disabled` styles for email field in profile settings.

* style: fix cycle layout height.
2024-01-05 14:13:04 +05:30
sriram veeraghanta
d98b688342 fix: merge conflicts resolved 2024-01-04 17:28:11 +05:30
sriram veeraghanta
ce21630388 Merge branch 'preview' of github.com:makeplane/plane into develop 2024-01-04 17:27:22 +05:30
M. Palanikannan
0927fa150c chore: Updated TableView component in table extension to solve Sentry (#3309)
error of table not being defined while calling getBoundingClientRect()
and solve other TS issues

- Added ResolvedPos import from @tiptap/pm/model
- Updated setCellsBackgroundColor function parameter type to string
- Declared ToolboxItem type for toolbox items
- Modified columnsToolboxItems and rowsToolboxItems to use the ToolboxItem type
- Updated createToolbox function parameters to specify Element or null for triggerButton and ToolboxItem[] for items
- Added ts-expect-error comment above the toolbox variable declaration
- Updated update method parameter type to readonly Decoration[]
- Changed destructuring assignment of hoveredTable and hoveredCell in updateControls method to use Object.values and reduce method
- Added null check for this.table in updateControls method
- Wrapped the code that updates columnsControl and rowsControl with null checks for each control
- Replaced ts-ignore comments with proper dispatch calls in selectColumn and selectRow methods

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2024-01-04 16:30:10 +05:30
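A sketch of the typing work this commit describes; the ToolboxItem shape and createToolbox signature are assumptions based on the bullet list:

// table toolbox typing sketch (assumed shapes)
type ToolboxItem = {
  label: string;
  action: () => void;
};

function createToolbox(triggerButton: Element | null, items: ToolboxItem[]): void {
  // Null-check before getBoundingClientRect: the guard behind the
  // "table not being defined" Sentry error mentioned above.
  if (!triggerButton) return;
  const rect = triggerButton.getBoundingClientRect();
  console.log(rect.top, items.map((item) => item.label));
}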
Aaryan Khandelwal
eec411baaf dev: new create issue modal (#3312) 2024-01-04 16:29:18 +05:30
sriram veeraghanta
ecc8fbd79b fix: Login workflow depending on smtp is configured (#3307) 2024-01-04 16:27:17 +05:30
sriram veeraghanta
c9b628e578 Merge branch 'preview' of github.com:makeplane/plane into develop 2024-01-04 16:25:27 +05:30
Anmol Singh Bhatia
b522de99ba chore: profile setting improvement (#3306) 2024-01-03 18:25:41 +05:30
Prateek Shourya
b58d7a715a style: remove unnecessary vertical scroll in All Issues Tabs. (#3300) 2024-01-03 12:52:55 +05:30
Bavisetti Narayan
87cd44bcd2 fix: migration file fixes (#3302) 2024-01-02 18:52:46 +05:30
Aaryan Khandelwal
804b7d8663 refactor: MobX store structure (#3228)
* query params from router as computed

* chore: setup workspace store and sub-stores

* chore: update router query store

* chore: update store types

* fix: pages store changes

* change observables and retain object reference

* fix build errors

* chore: changed the structure of workspace, project, cycle, module and pages

* fix: pages fixes

* fix: merge conflicts resolved

* chore: fixed workspace list

* chore: update workspace store according to the new response

* fix: adding page details to store

* fix: adding new contexts and providers

* dev: issues store and filters in new store

* dev: optimised the issue fetching in issue base store

* chore: project views id mapped

* update lodash set to directly run inside runInAction since it mutates the object

* fix: context changes

* code refactor kanban for better maintainability

* optimize Kanban for performance

* chore: implemented hooks for all the created stores

* chore: removed bridge id

* css change and refactor

* chore: update cycle store structure

* chore: implement the new label root store

* chore: removed object structure

* chore: implement project view hook

* Kanban new store implementation for project issues

* fix project root for kanban

* feat: workspace and project members endpoint (#3092)

* fix: merge conflicts resolved

* issue properties optimization

* chore: user stores

* chore: create new store context and update hooks

* chore: setup inbox store and implement router store

* chore: initialize and implement project estimate store

* chore: initialize global view store

* kanban and list view optimization

* chore: use new cycle and module store. (#3172)

* chore: use new cycle and module store.

* chore: minor improvements.

* Revert "chore: merge develop"

This reverts commit 9d2e0e29e7, reversing
changes made to 9595493c42.

* chore: implement useGlobalView hook

* refactor: projects & inbox store instances (#3179)

* refactor: projects & inbox store instances

* fix: formatting

* fix: action usage

* chore: implement useProjectState hook. (#3185)

* dev: issue, cycle store optimization

* fix build for code

* dev: removed dummy variables

* dev: issue store

* fix: adding todos

* chore: removing legacy store

* dev: issues store types and typos

* chore: cycle module user properties

* fix legacy store deletion issues

* chore: change POST to PATCH

* fix issues rendering for project root

* chore: removed workspace details in workspace invite

* chore: created models for display properties

* chore: setup member store and implement it everywhere

* refactor: module store (#3202)

* refactor: cycle store (#3192)

* refactor: cycle store

* some more improvements.

* chore: implement useLabel hook. (#3190)

* refactor: inbox & project related stores. (#3193)

* refactor: inbox -> filter, issues, inboxes & project -> publish, projects store

* refactor: workspace-project-id name

* fix kanban dropdown overlapping issue

* fix kanban layout minor re rendering

* chore: implement useMember store everywhere

* chore: create and implement editor mention store

* chore: removed the issue view user property

* chore: created at id changed

* dev: segway integration (#3132)

* feat: implemented rabbitmq

* dev: initialize segway with queue setup

* dev: import refactors

* dev: create communication with the segway server

* dev: create new workers

* dev: create celery node queue for consuming messages from django

* dev: node to celery connection

* dev: setup segway and django connection

* dev: refactor the structure and add database integration to the app

* dev: add external id and source added

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>

* dev: github importer (#3205)

* dev: initiate github import

* dev: github importer all issues import

* dev: github comments and links for the imported issues

* dev: update controller to use logger and spread the resultData in getAllEntities

* dev: removed console log

* dev: update code structure and sync functions

* dev: updated retry logic on exceptions

* dev: add imported data as well

* dev: update logger and repo fetch

* dev: update jira integration to new structure

* dev: update migrations

* dev: update the reason field

* chore: workspace object id removed

* chore: view's creation fixed

* refactor: mobx store improvements. (#3213)

* fix: state and label errors

* chore: remove legacy code

* fix: branch build fix (#3214)

* branch build fix for release-* in case of space, backend, proxy

* fixes

* chore: update store names and types

* fix - file size limit does not work on plane.settings.production (#3160)

* fix - file size limit does not work on plane.settings.production

* fix - file size limit does not work on plane.settings.production

* fix - file size limit does not work on plane.settings.production, move to common.py

---------

Co-authored-by: luanduongtel4vn <hoangluan@tel4vn.com>
Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>

* style: instance admin email settings ui & ux update. (#3186)

* refactor: use-user-auth hook (#3215)

* refactor: use-user-auth hook

* fix: user store currentUserLoader

* refactor: project-view & application related stores (#3207)

* refactor: project-view & application related stores

* rename: projectViews -> projectViewIds

* fix: project-view favourite state in store

* chore: remove unnecessary hooks and contexts (#3217)

* chore: update issue assignee property component

* chore: bug fixes & improvement (#3218)

* chore: draft issue validation added to prevent saving empty or whitespace title

* chore: resolve scrolling issue in page empty state

* chore: kanban layout quick add issue improvement

* fix: bugs & improvements (#3189)

* fix: workspace invitation modal form values reset

* fix: profile sidebar avatar letter

* [refactor] Editor code refactoring (#3194)

* removed relative imports from editor core

* Update issue widget file paths and imports to use kebab case instead of camel case, to align with coding conventions and improve consistency.

* Update Tiptap core and extensions versions to 2.1.13 and Tiptap React version to 2.1.13. Update Tiptap table imports to use the new location in package @tiptap/pm/tables. Update AlertLabel component to use the new type definition for LucideIcon.

* updated lock file

* removed default exports from editor/core

* fixed injecting css into the core package itself

* separated css code to have a single source of origin with respect to the package

* removed default imports from document editor

* fixed all instances using index as a key while mapping

* Update Lite Text Editor package.json to remove @plane/editor-types as a dependency.

Update Lite Text Editor index.ts to update the import of IMentionSuggestion and IMentionHighlight from @plane/editor-types to @plane/editor-core.

Update Lite Text Editor ui/index.tsx to update the import of UploadImage, DeleteImage, IMentionSuggestion, and RestoreImage from @plane/editor-types to @plane/editor-core.

Update Lite Text Editor ui/menus/fixed-menu/index.tsx to update the import of UploadImage from @plane/editor-types to @plane/editor-core.

Update turbo.json to remove @plane/editor-types#build as a dependency for @plane/lite-text-editor#build, @plane/rich-text-editor#build, and @plane/document-editor#build.

* Remove deprecated import and adjust tippy.js usage in the slash-commands.tsx file of the editor extensions package.

* Update dependencies in `rich-text-editor/package.json`, remove `@plane/editor-types` and add `@plane/editor-core` in `rich-text-editor/src/index.ts`, and update imports in `rich-text-editor/src/ui/extensions/index.tsx` and `rich-text-editor/src/ui/index.tsx` to use `@plane/editor-core` instead of `@plane/editor-types`.

* Update package.json dependencies and add new types for image deletion, upload, restore, mention highlight, mention suggestion, and slash command item.

* Update import statements in various files to use the new package "@plane/editor-core" instead of "@plane/editor-types".

* fixed document editor to follow conventions

* Refactor imports in the Rich Text Editor package to use relative paths instead of absolute paths.

- Updated imports in `index.ts`, `ui/index.tsx`, and `ui/menus/bubble-menu/index.tsx` to use relative paths.
- Updated `tsconfig.json` to include the `baseUrl` compiler option and adjust the `include` and `exclude` paths.

* Refactor Lite Text Editor code to use relative import paths instead of absolute import paths.

* Added LucideIconType to the exports in index.ts for use in other files.
Created a new file lucide-icon.ts which contains the type LucideIconType.
Updated the icon type in HeadingOneItem in menu-items/index.tsx to use LucideIconType.
Updated the Icon type in AlertLabel in alert-label.tsx to use LucideIconType.
Updated the Icon type in VerticalDropdownItemProps in vertical-dropdown-menu.tsx to use LucideIconType.
Updated the Icon type in BubbleMenuItem in fixed-menu/index.tsx to use LucideIconType.
Deleted the file tooltip.tsx since it is no longer used.
Updated the Icon type in BubbleMenuItem in bubble-menu/index.tsx to use LucideIconType.

* ♻️ refactor: simplify rendering logic in slash-commands.tsx

The rendering logic in the file "slash-commands.tsx" has been simplified. Previously, the code used inline positioning for the popup, but it has now been removed. Instead of appending the popup to the document body, it is now appended to the element with the ID "tiptap-container". The "flip" option has also been removed. These changes have improved the readability and maintainability of the code.

* fixed build errors caused due to core's internal imports

* regression: fixed pages not saving and not duplicating with proper content

* build: Update @tiptap dependencies

Updated the @tiptap dependencies in the package.json files of `document-editor`, `extensions`, and `rich-text-editor` packages to version 2.1.13.

* 🚑 fix: Correct appendTo selector in slash-commands.tsx

Update the `appendTo` function call in `slash-commands.tsx` to use the correct selector `#editor-container` instead of `#tiptap-container`. This ensures that the component is appended to the appropriate container in the editor extension.

Note: The commit message assumes that the change is a fix for an issue or error. If it's not a fix, please provide more context so that an appropriate commit type can be determined.

* style: email placeholder changed across the platform (#3206)

* style: email placeholder changed across the platform

* fix: placeholder text

* dev: updated new filter endpoints and restructured issue and issue filters store

* implement issues and replace useMobxStore

* remove all store legacy references

* dev: updated the orderby and subgroupby filters data

* dev: added projectId in issue filters for consistency

* fix more build errors

* dev: updated profile issues

* dev: removed store legacy

* dev: active cycle issues in the cycle issue store

* fix additional build errors and memoize issueActions in each layout component

* change store enums

* remove all useMobxStore references

* fix more build errors

* dev: reverted workspace invitation

* fix: build errors and warnings

* fix: optimistic update for instant operations (#3221)

* fix: update functions failed case

* fix: typo

* chore: revert back to optimistic update approach for all `update related actions` (#3219)

* fix: merge conflicts resolved

* chore: update memberMap logic in components

* add assignees to kanban groups and properties

* dev: migration fixes

* final bit of optimization on list view

* change all TODOs that are to be done before this release to FIXME

* change base Kanban TODOs that are to be done before this release to FIXME

* dev: add fields and expand for app serializers

* dev: issue detail store

* dev: update issue serializer to return object ids

* fix: Instance key added in settings and converted issues list api to array instead of dict

* fix: removing segway files

* dev: control expand through query parameters

* revert: github importer

* Revert "dev: segway intgegration (#3132)"

This reverts commit 1cc18a0915.

* dev: remove migrations for segway

* dev: issue structure change and created workspacebasemodel

* dev: issue detail serializer

* fix: changed workspace dict

* dev: updated new issue structure

* chore: build fix

* dev: issue detail store refactor

* dev: created list endpoint for issue-relation

* dev: added issue attachments in issue detail store

* dev: added issue activity computed

* fix: build error

* chore: peek overview modal context added

* chore: build error fix

* dev: added sub_issues in issue details store

* dev: added complete issue serializer for sub issues

* dev: resolved type errors in issue root store

* dev: changed the issue relation structure

* chore: new global dropdowns

* chore: build error fix

* chore: cycle and module selection if disabled

* dev: removed unnecessary code from the workspace root

* chore: build error fix

* chore: issue relation remove endpoint

* fix: build error

* dev: typos and implemented issue relation store

* fix: yarn lock updated

* style: update the UI of all the dropdowns

* fix: state store fixes

* fix: key issue

* fix: state store console logs removed

* refactor: member dropdowns

* fix: moving types to packages

* fix: dropdown arrow positioning

* dev: removed logs

* style: label dropdown

* chore: restrict description notifications

* chore: description changes

* chore: update spreadsheet layout dropdowns

* fix: build errors

* chore: duplicate key change

* fix: ui bugs

* chore: relation activity change

* chore: comment activity changes

* chore: blocking issue removal

* chore: added project_id for relation

* chore: issue relation store and component

* chore: issue redirection issue in the issue relation in detail page

* chore: created activity changed

* chore: issue links new store implementation on the issue detail

* chore: issue relation deletion activity changed

* chore: issue attachments new store implementation on the issue detail

* chore: workspace level issues

* fix: build errors

---------

Co-authored-by: rahulramesha <rahulramesham@gmail.com>
Co-authored-by: gurusainath <gurusainath007@gmail.com>
Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
Co-authored-by: Bavisetti Narayan <72156168+NarayanBavisetti@users.noreply.github.com>
Co-authored-by: Prateek Shourya <prateekshourya29@gmail.com>
Co-authored-by: Lakhan Baheti <94619783+1akhanBaheti@users.noreply.github.com>
Co-authored-by: Nikhil <118773738+pablohashescobar@users.noreply.github.com>
Co-authored-by: Manish Gupta <59428681+mguptahub@users.noreply.github.com>
Co-authored-by: Hoang Luan <luandnh98@gmail.com>
Co-authored-by: luanduongtel4vn <hoangluan@tel4vn.com>
Co-authored-by: Anmol Singh Bhatia <121005188+anmolsinghbhatia@users.noreply.github.com>
Co-authored-by: M. Palanikannan <73993394+Palanikannan1437@users.noreply.github.com>
Co-authored-by: pablohashescobar <nikhilschacko@gmail.com>
Co-authored-by: Anmol Singh Bhatia <anmolsinghbhatia@plane.so>
2024-01-02 18:12:55 +05:30
Prateek Shourya
1539340113 chore: date and time standardization all across the platform. (#3283)
* chore: date and time standardization all across the platform.

* chore: update `renderFormattedTime` function.
* remove unwanted code.

* fix: build errors

* chore: update `renderFormattedTime` function params.
2024-01-02 14:45:51 +05:30
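A minimal sketch of a renderFormattedTime helper consistent with the description (the actual signature in Plane may differ):

// renderFormattedTime sketch (assumed signature)
export const renderFormattedTime = (
  date: string | number | Date,
  timeFormat: "12-hour" | "24-hour" = "24-hour"
): string =>
  new Date(date).toLocaleTimeString("en-US", {
    hour: "2-digit",
    minute: "2-digit",
    hour12: timeFormat === "12-hour",
  });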
Anmol Singh Bhatia
d9ee692ce9 chore: gpt modal refactor (#3276)
* chore: gpt modal refactor

* chore: refactored gpt assistant modal to popover component
2024-01-02 13:07:12 +05:30
1338 changed files with 55483 additions and 47722 deletions

View File

@@ -10,7 +10,7 @@ env:
   SOURCE_BRANCH_NAME: ${{github.event.pull_request.base.ref}}
 jobs:
-  create_pr:
+  sync_changes:
     # Only run the job when a PR is merged
     if: github.event.pull_request.merged == true
     runs-on: ubuntu-latest

View File

@@ -39,7 +39,6 @@ OPENAI_API_BASE="https://api.openai.com/v1" # deprecated
 OPENAI_API_KEY="sk-" # deprecated
 GPT_ENGINE="gpt-3.5-turbo" # deprecated
 # Settings related to Docker
-DOCKERIZED=1 # deprecated

View File

@@ -26,7 +26,9 @@ def update_description():
             updated_issues.append(issue)
         Issue.objects.bulk_update(
-            updated_issues, ["description_html", "description_stripped"], batch_size=100
+            updated_issues,
+            ["description_html", "description_stripped"],
+            batch_size=100,
         )
         print("Success")
     except Exception as e:
@@ -40,7 +42,9 @@ def update_comments():
         updated_issue_comments = []
         for issue_comment in issue_comments:
-            issue_comment.comment_html = f"<p>{issue_comment.comment_stripped}</p>"
+            issue_comment.comment_html = (
+                f"<p>{issue_comment.comment_stripped}</p>"
+            )
             updated_issue_comments.append(issue_comment)
         IssueComment.objects.bulk_update(
@@ -99,7 +103,9 @@ def updated_issue_sort_order():
             issue.sort_order = issue.sequence_id * random.randint(100, 500)
             updated_issues.append(issue)
-        Issue.objects.bulk_update(updated_issues, ["sort_order"], batch_size=100)
+        Issue.objects.bulk_update(
+            updated_issues, ["sort_order"], batch_size=100
+        )
         print("Success")
     except Exception as e:
         print(e)
@@ -137,7 +143,9 @@ def update_project_cover_images():
             project.cover_image = project_cover_images[random.randint(0, 19)]
             updated_projects.append(project)
-        Project.objects.bulk_update(updated_projects, ["cover_image"], batch_size=100)
+        Project.objects.bulk_update(
+            updated_projects, ["cover_image"], batch_size=100
+        )
         print("Success")
     except Exception as e:
         print(e)
@@ -186,7 +194,9 @@ def update_label_color():
 def create_slack_integration():
     try:
-        _ = Integration.objects.create(provider="slack", network=2, title="Slack")
+        _ = Integration.objects.create(
+            provider="slack", network=2, title="Slack"
+        )
         print("Success")
     except Exception as e:
         print(e)
@@ -212,12 +222,16 @@ def update_integration_verified():
 def update_start_date():
     try:
-        issues = Issue.objects.filter(state__group__in=["started", "completed"])
+        issues = Issue.objects.filter(
+            state__group__in=["started", "completed"]
+        )
         updated_issues = []
         for issue in issues:
             issue.start_date = issue.created_at.date()
             updated_issues.append(issue)
-        Issue.objects.bulk_update(updated_issues, ["start_date"], batch_size=500)
+        Issue.objects.bulk_update(
+            updated_issues, ["start_date"], batch_size=500
+        )
         print("Success")
     except Exception as e:
         print(e)

View File

@@ -2,10 +2,10 @@
 import os
 import sys
-if __name__ == '__main__':
+if __name__ == "__main__":
     os.environ.setdefault(
-        'DJANGO_SETTINGS_MODULE',
-        'plane.settings.production')
+        "DJANGO_SETTINGS_MODULE", "plane.settings.production"
+    )
     try:
         from django.core.management import execute_from_command_line
     except ImportError as exc:

View File

@@ -1,3 +1,3 @@
 from .celery import app as celery_app

-__all__ = ('celery_app',)
+__all__ = ("celery_app",)

View File

@@ -2,4 +2,4 @@ from django.apps import AppConfig

 class AnalyticsConfig(AppConfig):
-    name = 'plane.analytics'
+    name = "plane.analytics"

View File

@@ -25,7 +25,10 @@ class APIKeyAuthentication(authentication.BaseAuthentication):
     def validate_api_token(self, token):
         try:
             api_token = APIToken.objects.get(
-                Q(Q(expired_at__gt=timezone.now()) | Q(expired_at__isnull=True)),
+                Q(
+                    Q(expired_at__gt=timezone.now())
+                    | Q(expired_at__isnull=True)
+                ),
                 token=token,
                 is_active=True,
             )

View File

@@ -1,17 +1,18 @@
 from rest_framework.throttling import SimpleRateThrottle
+
 class ApiKeyRateThrottle(SimpleRateThrottle):
-    scope = 'api_key'
-    rate = '60/minute'
+    scope = "api_key"
+    rate = "60/minute"

     def get_cache_key(self, request, view):
         # Retrieve the API key from the request header
-        api_key = request.headers.get('X-Api-Key')
+        api_key = request.headers.get("X-Api-Key")
         if not api_key:
             return None # Allow the request if there's no API key

         # Use the API key as part of the cache key
-        return f'{self.scope}:{api_key}'
+        return f"{self.scope}:{api_key}"

     def allow_request(self, request, view):
         allowed = super().allow_request(request, view)
@@ -35,7 +36,7 @@ class ApiKeyRateThrottle(SimpleRateThrottle):
         reset_time = int(now + self.duration)

         # Add headers
-        request.META['X-RateLimit-Remaining'] = max(0, available)
-        request.META['X-RateLimit-Reset'] = reset_time
+        request.META["X-RateLimit-Remaining"] = max(0, available)
+        request.META["X-RateLimit-Reset"] = reset_time
         return allowed

View File

@@ -13,5 +13,9 @@ from .issue import (
 )
 from .state import StateLiteSerializer, StateSerializer
 from .cycle import CycleSerializer, CycleIssueSerializer, CycleLiteSerializer
-from .module import ModuleSerializer, ModuleIssueSerializer, ModuleLiteSerializer
+from .module import (
+    ModuleSerializer,
+    ModuleIssueSerializer,
+    ModuleLiteSerializer,
+)
 from .inbox import InboxIssueSerializer

View File

@@ -100,6 +100,8 @@ class BaseSerializer(serializers.ModelSerializer):
                     response[expand] = exp_serializer.data
                 else:
                     # You might need to handle this case differently
-                    response[expand] = getattr(instance, f"{expand}_id", None)
+                    response[expand] = getattr(
+                        instance, f"{expand}_id", None
+                    )
         return response

View File

@@ -23,7 +23,9 @@ class CycleSerializer(BaseSerializer):
             and data.get("end_date", None) is not None
             and data.get("start_date", None) > data.get("end_date", None)
         ):
-            raise serializers.ValidationError("Start date cannot exceed end date")
+            raise serializers.ValidationError(
+                "Start date cannot exceed end date"
+            )
         return data

     class Meta:
@@ -55,7 +57,6 @@ class CycleIssueSerializer(BaseSerializer):
 class CycleLiteSerializer(BaseSerializer):
     class Meta:
         model = Cycle
-
         fields = "__all__"

View File

@@ -2,8 +2,8 @@
 from .base import BaseSerializer
 from plane.db.models import InboxIssue

-class InboxIssueSerializer(BaseSerializer):
+class InboxIssueSerializer(BaseSerializer):
     class Meta:
         model = InboxIssue
         fields = "__all__"

View File

@@ -27,6 +27,7 @@ from .module import ModuleSerializer, ModuleLiteSerializer
 from .user import UserLiteSerializer
 from .state import StateLiteSerializer
+
 class IssueSerializer(BaseSerializer):
     assignees = serializers.ListField(
         child=serializers.PrimaryKeyRelatedField(
@@ -66,12 +67,14 @@ class IssueSerializer(BaseSerializer):
             and data.get("target_date", None) is not None
             and data.get("start_date", None) > data.get("target_date", None)
         ):
-            raise serializers.ValidationError("Start date cannot exceed target date")
+            raise serializers.ValidationError(
+                "Start date cannot exceed target date"
+            )
         try:
-            if(data.get("description_html", None) is not None):
+            if data.get("description_html", None) is not None:
                 parsed = html.fromstring(data["description_html"])
-                parsed_str = html.tostring(parsed, encoding='unicode')
+                parsed_str = html.tostring(parsed, encoding="unicode")
                 data["description_html"] = parsed_str
         except Exception as e:
@@ -96,7 +99,8 @@ class IssueSerializer(BaseSerializer):
         if (
             data.get("state")
             and not State.objects.filter(
-                project_id=self.context.get("project_id"), pk=data.get("state").id
+                project_id=self.context.get("project_id"),
+                pk=data.get("state").id,
             ).exists()
         ):
             raise serializers.ValidationError(
@@ -107,7 +111,8 @@ class IssueSerializer(BaseSerializer):
         if (
             data.get("parent")
             and not Issue.objects.filter(
-                workspace_id=self.context.get("workspace_id"), pk=data.get("parent").id
+                workspace_id=self.context.get("workspace_id"),
+                pk=data.get("parent").id,
             ).exists()
         ):
             raise serializers.ValidationError(
@@ -238,9 +243,13 @@ class IssueSerializer(BaseSerializer):
         ]
         if "labels" in self.fields:
             if "labels" in self.expand:
-                data["labels"] = LabelSerializer(instance.labels.all(), many=True).data
+                data["labels"] = LabelSerializer(
+                    instance.labels.all(), many=True
+                ).data
             else:
-                data["labels"] = [str(label.id) for label in instance.labels.all()]
+                data["labels"] = [
+                    str(label.id) for label in instance.labels.all()
+                ]
         return data
@@ -278,7 +287,8 @@ class IssueLinkSerializer(BaseSerializer):
     # Validation if url already exists
     def create(self, validated_data):
         if IssueLink.objects.filter(
-            url=validated_data.get("url"), issue_id=validated_data.get("issue_id")
+            url=validated_data.get("url"),
+            issue_id=validated_data.get("issue_id"),
         ).exists():
             raise serializers.ValidationError(
                 {"error": "URL already exists for this Issue"}
@@ -324,9 +334,9 @@ class IssueCommentSerializer(BaseSerializer):
     def validate(self, data):
         try:
-            if(data.get("comment_html", None) is not None):
+            if data.get("comment_html", None) is not None:
                 parsed = html.fromstring(data["comment_html"])
-                parsed_str = html.tostring(parsed, encoding='unicode')
+                parsed_str = html.tostring(parsed, encoding="unicode")
                 data["comment_html"] = parsed_str
         except Exception as e:
@@ -362,7 +372,6 @@ class ModuleIssueSerializer(BaseSerializer):
 class LabelLiteSerializer(BaseSerializer):
     class Meta:
         model = Label
-
         fields = [

View File

@@ -52,7 +52,9 @@ class ModuleSerializer(BaseSerializer):
             and data.get("target_date", None) is not None
             and data.get("start_date", None) > data.get("target_date", None)
         ):
-            raise serializers.ValidationError("Start date cannot exceed target date")
+            raise serializers.ValidationError(
+                "Start date cannot exceed target date"
+            )
         if data.get("members", []):
             data["members"] = ProjectMember.objects.filter(
@@ -146,7 +148,8 @@ class ModuleLinkSerializer(BaseSerializer):
     # Validation if url already exists
     def create(self, validated_data):
         if ModuleLink.objects.filter(
-            url=validated_data.get("url"), module_id=validated_data.get("module_id")
+            url=validated_data.get("url"),
+            module_id=validated_data.get("module_id"),
         ).exists():
             raise serializers.ValidationError(
                 {"error": "URL already exists for this Issue"}
@@ -155,7 +158,6 @@ class ModuleLinkSerializer(BaseSerializer):
 class ModuleLiteSerializer(BaseSerializer):
     class Meta:
         model = Module
-
         fields = "__all__"

View File

@@ -2,12 +2,17 @@
 from rest_framework import serializers

 # Module imports
-from plane.db.models import Project, ProjectIdentifier, WorkspaceMember, State, Estimate
+from plane.db.models import (
+    Project,
+    ProjectIdentifier,
+    WorkspaceMember,
+    State,
+    Estimate,
+)
 from .base import BaseSerializer
+
 class ProjectSerializer(BaseSerializer):
     total_members = serializers.IntegerField(read_only=True)
     total_cycles = serializers.IntegerField(read_only=True)
     total_modules = serializers.IntegerField(read_only=True)
@@ -21,7 +26,7 @@ class ProjectSerializer(BaseSerializer):
         fields = "__all__"
         read_only_fields = [
             "id",
-            'emoji',
+            "emoji",
             "workspace",
             "created_at",
             "updated_at",
@@ -59,12 +64,16 @@ class ProjectSerializer(BaseSerializer):
     def create(self, validated_data):
         identifier = validated_data.get("identifier", "").strip().upper()
         if identifier == "":
-            raise serializers.ValidationError(detail="Project Identifier is required")
+            raise serializers.ValidationError(
+                detail="Project Identifier is required"
+            )
         if ProjectIdentifier.objects.filter(
             name=identifier, workspace_id=self.context["workspace_id"]
         ).exists():
-            raise serializers.ValidationError(detail="Project Identifier is taken")
+            raise serializers.ValidationError(
+                detail="Project Identifier is taken"
+            )
         project = Project.objects.create(
             **validated_data, workspace_id=self.context["workspace_id"]

View File

@@ -7,9 +7,9 @@ class StateSerializer(BaseSerializer):
     def validate(self, data):
         # If the default is being provided then make all other states default False
         if data.get("default", False):
-            State.objects.filter(project_id=self.context.get("project_id")).update(
-                default=False
-            )
+            State.objects.filter(
+                project_id=self.context.get("project_id")
+            ).update(default=False)
         return data

     class Meta:

View File

@@ -5,6 +5,7 @@ from .base import BaseSerializer
 class WorkspaceLiteSerializer(BaseSerializer):
     """Lite serializer with only required fields"""
+
     class Meta:
         model = Workspace
         fields = [

View File

@@ -41,7 +41,9 @@ class WebhookMixin:
     bulk = False

     def finalize_response(self, request, response, *args, **kwargs):
-        response = super().finalize_response(request, response, *args, **kwargs)
+        response = super().finalize_response(
+            request, response, *args, **kwargs
+        )

         # Check for the case should webhook be sent
         if (
@@ -139,7 +141,9 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
     def finalize_response(self, request, response, *args, **kwargs):
         # Call super to get the default response
-        response = super().finalize_response(request, response, *args, **kwargs)
+        response = super().finalize_response(
+            request, response, *args, **kwargs
+        )

         # Add custom headers if they exist in the request META
         ratelimit_remaining = request.META.get("X-RateLimit-Remaining")
@@ -163,13 +167,17 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
     @property
     def fields(self):
         fields = [
-            field for field in self.request.GET.get("fields", "").split(",") if field
+            field
+            for field in self.request.GET.get("fields", "").split(",")
+            if field
         ]
         return fields if fields else None

     @property
     def expand(self):
         expand = [
-            expand for expand in self.request.GET.get("expand", "").split(",") if expand
+            expand
+            for expand in self.request.GET.get("expand", "").split(",")
+            if expand
         ]
         return expand if expand else None

View File

@@ -12,7 +12,13 @@ from rest_framework import status
 # Module imports
 from .base import BaseAPIView, WebhookMixin
-from plane.db.models import Cycle, Issue, CycleIssue, IssueLink, IssueAttachment
+from plane.db.models import (
+    Cycle,
+    Issue,
+    CycleIssue,
+    IssueLink,
+    IssueAttachment,
+)
 from plane.app.permissions import ProjectEntityPermission
 from plane.api.serializers import (
     CycleSerializer,
@@ -102,7 +108,9 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
                     ),
                 )
             )
-            .annotate(total_estimates=Sum("issue_cycle__issue__estimate_point"))
+            .annotate(
+                total_estimates=Sum("issue_cycle__issue__estimate_point")
+            )
             .annotate(
                 completed_estimates=Sum(
                     "issue_cycle__issue__estimate_point",
@@ -201,7 +209,8 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
         # Incomplete Cycles
         if cycle_view == "incomplete":
             queryset = queryset.filter(
-                Q(end_date__gte=timezone.now().date()) | Q(end_date__isnull=True),
+                Q(end_date__gte=timezone.now().date())
+                | Q(end_date__isnull=True),
             )
             return self.paginate(
                 request=request,
@@ -238,8 +247,12 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
                     project_id=project_id,
                     owned_by=request.user,
                 )
-                return Response(serializer.data, status=status.HTTP_201_CREATED)
-            return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
+                return Response(
+                    serializer.data, status=status.HTTP_201_CREATED
+                )
+            return Response(
+                serializer.errors, status=status.HTTP_400_BAD_REQUEST
+            )
         else:
             return Response(
                 {
@@ -249,15 +262,22 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
             )

     def patch(self, request, slug, project_id, pk):
-        cycle = Cycle.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
+        cycle = Cycle.objects.get(
+            workspace__slug=slug, project_id=project_id, pk=pk
+        )
         request_data = request.data
-        if cycle.end_date is not None and cycle.end_date < timezone.now().date():
+        if (
+            cycle.end_date is not None
+            and cycle.end_date < timezone.now().date()
+        ):
             if "sort_order" in request_data:
                 # Can only change sort order
                 request_data = {
-                    "sort_order": request_data.get("sort_order", cycle.sort_order)
+                    "sort_order": request_data.get(
+                        "sort_order", cycle.sort_order
+                    )
                 }
             else:
                 return Response(
@@ -275,11 +295,13 @@ class CycleAPIEndpoint(WebhookMixin, BaseAPIView):
     def delete(self, request, slug, project_id, pk):
         cycle_issues = list(
-            CycleIssue.objects.filter(cycle_id=self.kwargs.get("pk")).values_list(
-                "issue", flat=True
-            )
+            CycleIssue.objects.filter(
+                cycle_id=self.kwargs.get("pk")
+            ).values_list("issue", flat=True)
         )
-        cycle = Cycle.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
+        cycle = Cycle.objects.get(
+            workspace__slug=slug, project_id=project_id, pk=pk
+        )
         issue_activity.delay(
             type="cycle.activity.deleted",
@@ -319,7 +341,9 @@ class CycleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
     def get_queryset(self):
         return (
             CycleIssue.objects.annotate(
-                sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("issue_id"))
+                sub_issues_count=Issue.issue_objects.filter(
+                    parent=OuterRef("issue_id")
+                )
                 .order_by()
                 .annotate(count=Func(F("id"), function="Count"))
                 .values("count")
@@ -342,7 +366,9 @@ class CycleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
         issues = (
             Issue.issue_objects.filter(issue_cycle__cycle_id=cycle_id)
             .annotate(
-                sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
+                sub_issues_count=Issue.issue_objects.filter(
+                    parent=OuterRef("id")
+                )
                 .order_by()
                 .annotate(count=Func(F("id"), function="Count"))
                 .values("count")
@@ -364,7 +390,9 @@ class CycleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
                 .values("count")
             )
             .annotate(
-                attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
+                attachment_count=IssueAttachment.objects.filter(
+                    issue=OuterRef("id")
+                )
                 .order_by()
                 .annotate(count=Func(F("id"), function="Count"))
                 .values("count")
@@ -387,14 +415,18 @@ class CycleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
         if not issues:
             return Response(
-                {"error": "Issues are required"}, status=status.HTTP_400_BAD_REQUEST
+                {"error": "Issues are required"},
+                status=status.HTTP_400_BAD_REQUEST,
             )
         cycle = Cycle.objects.get(
             workspace__slug=slug, project_id=project_id, pk=cycle_id
         )
-        if cycle.end_date is not None and cycle.end_date < timezone.now().date():
+        if (
+            cycle.end_date is not None
+            and cycle.end_date < timezone.now().date()
+        ):
             return Response(
                 {
                     "error": "The Cycle has already been completed so no new issues can be added"
@@ -479,7 +511,10 @@ class CycleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
     def delete(self, request, slug, project_id, cycle_id, issue_id):
         cycle_issue = CycleIssue.objects.get(
-            issue_id=issue_id, workspace__slug=slug, project_id=project_id, cycle_id=cycle_id
+            issue_id=issue_id,
+            workspace__slug=slug,
+            project_id=project_id,
+            cycle_id=cycle_id,
         )
         issue_id = cycle_issue.issue_id
         cycle_issue.delete()

View File

@@ -14,7 +14,14 @@ from rest_framework.response import Response
 from .base import BaseAPIView
 from plane.app.permissions import ProjectLitePermission
 from plane.api.serializers import InboxIssueSerializer, IssueSerializer
-from plane.db.models import InboxIssue, Issue, State, ProjectMember, Project, Inbox
+from plane.db.models import (
+    InboxIssue,
+    Issue,
+    State,
+    ProjectMember,
+    Project,
+    Inbox,
+)
 from plane.bgtasks.issue_activites_task import issue_activity
@@ -43,7 +50,8 @@ class InboxIssueAPIEndpoint(BaseAPIView):
         ).first()
         project = Project.objects.get(
-            workspace__slug=self.kwargs.get("slug"), pk=self.kwargs.get("project_id")
+            workspace__slug=self.kwargs.get("slug"),
+            pk=self.kwargs.get("project_id"),
         )
         if inbox is None and not project.inbox_view:
@@ -51,7 +59,8 @@ class InboxIssueAPIEndpoint(BaseAPIView):
         return (
             InboxIssue.objects.filter(
-                Q(snoozed_till__gte=timezone.now()) | Q(snoozed_till__isnull=True),
+                Q(snoozed_till__gte=timezone.now())
+                | Q(snoozed_till__isnull=True),
                 workspace__slug=self.kwargs.get("slug"),
                 project_id=self.kwargs.get("project_id"),
                 inbox_id=inbox.id,
@@ -87,7 +96,8 @@ class InboxIssueAPIEndpoint(BaseAPIView):
     def post(self, request, slug, project_id):
         if not request.data.get("issue", {}).get("name", False):
             return Response(
-                {"error": "Name is required"}, status=status.HTTP_400_BAD_REQUEST
+                {"error": "Name is required"},
+                status=status.HTTP_400_BAD_REQUEST,
             )
         inbox = Inbox.objects.filter(
@@ -117,7 +127,8 @@ class InboxIssueAPIEndpoint(BaseAPIView):
             "none",
         ]:
             return Response(
-                {"error": "Invalid priority"}, status=status.HTTP_400_BAD_REQUEST
+                {"error": "Invalid priority"},
+                status=status.HTTP_400_BAD_REQUEST,
             )
         # Create or get state
@@ -222,10 +233,14 @@ class InboxIssueAPIEndpoint(BaseAPIView):
                 "description_html": issue_data.get(
                     "description_html", issue.description_html
                 ),
-                "description": issue_data.get("description", issue.description),
+                "description": issue_data.get(
+                    "description", issue.description
+                ),
             }
-            issue_serializer = IssueSerializer(issue, data=issue_data, partial=True)
+            issue_serializer = IssueSerializer(
+                issue, data=issue_data, partial=True
+            )
             if issue_serializer.is_valid():
                 current_instance = issue
@@ -266,7 +281,9 @@ class InboxIssueAPIEndpoint(BaseAPIView):
                     project_id=project_id,
                 )
                 state = State.objects.filter(
-                    group="cancelled", workspace__slug=slug, project_id=project_id
+                    group="cancelled",
+                    workspace__slug=slug,
+                    project_id=project_id,
                 ).first()
                 if state is not None:
                     issue.state = state
@@ -284,17 +301,22 @@ class InboxIssueAPIEndpoint(BaseAPIView):
                 if issue.state.name == "Triage":
                     # Move to default state
                     state = State.objects.filter(
-                        workspace__slug=slug, project_id=project_id, default=True
+                        workspace__slug=slug,
+                        project_id=project_id,
+                        default=True,
                     ).first()
                     if state is not None:
                         issue.state = state
                         issue.save()
             return Response(serializer.data, status=status.HTTP_200_OK)
-            return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
+            return Response(
+                serializer.errors, status=status.HTTP_400_BAD_REQUEST
+            )
         else:
             return Response(
-                InboxIssueSerializer(inbox_issue).data, status=status.HTTP_200_OK
+                InboxIssueSerializer(inbox_issue).data,
+                status=status.HTTP_200_OK,
             )

     def delete(self, request, slug, project_id, issue_id):

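The snooze filter split across two lines in the hunk above is a plain OR of two Q objects: keep inbox issues that were never snoozed, plus those whose snooze window has not yet lapsed. A standalone sketch, assuming plane's `InboxIssue` model and an `inbox` instance:

```python
from django.db.models import Q
from django.utils import timezone

visible = InboxIssue.objects.filter(
    Q(snoozed_till__gte=timezone.now())  # snooze window still open
    | Q(snoozed_till__isnull=True),      # never snoozed
    inbox_id=inbox.id,
)
```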

@@ -67,7 +67,9 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
def get_queryset(self):
return (
Issue.issue_objects.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -86,7 +88,9 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
def get(self, request, slug, project_id, pk=None):
if pk:
issue = Issue.issue_objects.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -102,7 +106,13 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
# Custom ordering for priority and state
priority_order = ["urgent", "high", "medium", "low", "none"]
state_order = ["backlog", "unstarted", "started", "completed", "cancelled"]
state_order = [
"backlog",
"unstarted",
"started",
"completed",
"cancelled",
]
order_by_param = request.GET.get("order_by", "-created_at")
@@ -117,7 +127,9 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -127,7 +139,9 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
# Priority Ordering
if order_by_param == "priority" or order_by_param == "-priority":
priority_order = (
priority_order if order_by_param == "priority" else priority_order[::-1]
priority_order
if order_by_param == "priority"
else priority_order[::-1]
)
issue_queryset = issue_queryset.annotate(
priority_order=Case(
@@ -175,7 +189,9 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
else order_by_param
)
).order_by(
"-max_values" if order_by_param.startswith("-") else "max_values"
"-max_values"
if order_by_param.startswith("-")
else "max_values"
)
else:
issue_queryset = issue_queryset.order_by(order_by_param)
@@ -209,7 +225,9 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
# Track the issue
issue_activity.delay(
type="issue.activity.created",
requested_data=json.dumps(self.request.data, cls=DjangoJSONEncoder),
requested_data=json.dumps(
self.request.data, cls=DjangoJSONEncoder
),
actor_id=str(request.user.id),
issue_id=str(serializer.data.get("id", None)),
project_id=str(project_id),
@@ -220,7 +238,9 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def patch(self, request, slug, project_id, pk=None):
issue = Issue.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
issue = Issue.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
project = Project.objects.get(pk=project_id)
current_instance = json.dumps(
IssueSerializer(issue).data, cls=DjangoJSONEncoder
@@ -250,7 +270,9 @@ class IssueAPIEndpoint(WebhookMixin, BaseAPIView):
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def delete(self, request, slug, project_id, pk=None):
issue = Issue.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
issue = Issue.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
current_instance = json.dumps(
IssueSerializer(issue).data, cls=DjangoJSONEncoder
)
@@ -297,11 +319,17 @@ class LabelAPIEndpoint(BaseAPIView):
serializer = LabelSerializer(data=request.data)
if serializer.is_valid():
serializer.save(project_id=project_id)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
return Response(
serializer.data, status=status.HTTP_201_CREATED
)
return Response(
serializer.errors, status=status.HTTP_400_BAD_REQUEST
)
except IntegrityError:
return Response(
{"error": "Label with the same name already exists in the project"},
{
"error": "Label with the same name already exists in the project"
},
status=status.HTTP_400_BAD_REQUEST,
)
@@ -318,7 +346,11 @@ class LabelAPIEndpoint(BaseAPIView):
).data,
)
label = self.get_queryset().get(pk=pk)
serializer = LabelSerializer(label, fields=self.fields, expand=self.expand,)
serializer = LabelSerializer(
label,
fields=self.fields,
expand=self.expand,
)
return Response(serializer.data, status=status.HTTP_200_OK)
def patch(self, request, slug, project_id, pk=None):
@@ -329,7 +361,6 @@ class LabelAPIEndpoint(BaseAPIView):
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def delete(self, request, slug, project_id, pk=None):
label = self.get_queryset().get(pk=pk)
label.delete()
@@ -395,7 +426,9 @@ class IssueLinkAPIEndpoint(BaseAPIView):
)
issue_activity.delay(
type="link.activity.created",
requested_data=json.dumps(serializer.data, cls=DjangoJSONEncoder),
requested_data=json.dumps(
serializer.data, cls=DjangoJSONEncoder
),
actor_id=str(self.request.user.id),
issue_id=str(self.kwargs.get("issue_id")),
project_id=str(self.kwargs.get("project_id")),
@@ -407,14 +440,19 @@ class IssueLinkAPIEndpoint(BaseAPIView):
def patch(self, request, slug, project_id, issue_id, pk):
issue_link = IssueLink.objects.get(
workspace__slug=slug, project_id=project_id, issue_id=issue_id, pk=pk
workspace__slug=slug,
project_id=project_id,
issue_id=issue_id,
pk=pk,
)
requested_data = json.dumps(request.data, cls=DjangoJSONEncoder)
current_instance = json.dumps(
IssueLinkSerializer(issue_link).data,
cls=DjangoJSONEncoder,
)
serializer = IssueLinkSerializer(issue_link, data=request.data, partial=True)
serializer = IssueLinkSerializer(
issue_link, data=request.data, partial=True
)
if serializer.is_valid():
serializer.save()
issue_activity.delay(
@@ -431,7 +469,10 @@ class IssueLinkAPIEndpoint(BaseAPIView):
def delete(self, request, slug, project_id, issue_id, pk):
issue_link = IssueLink.objects.get(
workspace__slug=slug, project_id=project_id, issue_id=issue_id, pk=pk
workspace__slug=slug,
project_id=project_id,
issue_id=issue_id,
pk=pk,
)
current_instance = json.dumps(
IssueLinkSerializer(issue_link).data,
@@ -466,7 +507,9 @@ class IssueCommentAPIEndpoint(WebhookMixin, BaseAPIView):
def get_queryset(self):
return (
IssueComment.objects.filter(workspace__slug=self.kwargs.get("slug"))
IssueComment.objects.filter(
workspace__slug=self.kwargs.get("slug")
)
.filter(project_id=self.kwargs.get("project_id"))
.filter(issue_id=self.kwargs.get("issue_id"))
.filter(project__project_projectmember__member=self.request.user)
@@ -518,7 +561,9 @@ class IssueCommentAPIEndpoint(WebhookMixin, BaseAPIView):
)
issue_activity.delay(
type="comment.activity.created",
requested_data=json.dumps(serializer.data, cls=DjangoJSONEncoder),
requested_data=json.dumps(
serializer.data, cls=DjangoJSONEncoder
),
actor_id=str(self.request.user.id),
issue_id=str(self.kwargs.get("issue_id")),
project_id=str(self.kwargs.get("project_id")),
@@ -530,7 +575,10 @@ class IssueCommentAPIEndpoint(WebhookMixin, BaseAPIView):
def patch(self, request, slug, project_id, issue_id, pk):
issue_comment = IssueComment.objects.get(
workspace__slug=slug, project_id=project_id, issue_id=issue_id, pk=pk
workspace__slug=slug,
project_id=project_id,
issue_id=issue_id,
pk=pk,
)
requested_data = json.dumps(self.request.data, cls=DjangoJSONEncoder)
current_instance = json.dumps(
@@ -556,7 +604,10 @@ class IssueCommentAPIEndpoint(WebhookMixin, BaseAPIView):
def delete(self, request, slug, project_id, issue_id, pk):
issue_comment = IssueComment.objects.get(
workspace__slug=slug, project_id=project_id, issue_id=issue_id, pk=pk
workspace__slug=slug,
project_id=project_id,
issue_id=issue_id,
pk=pk,
)
current_instance = json.dumps(
IssueCommentSerializer(issue_comment).data,

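The priority/state ordering above reverses a fixed rank list for descending sorts and then annotates each row with its rank; the `When` clauses themselves are elided from the hunk. A hedged reconstruction of that annotation (the clause bodies are my guess at the pattern, not a quote):

```python
from django.db.models import Case, IntegerField, Value, When

priority_order = ["urgent", "high", "medium", "low", "none"]
if order_by_param == "-priority":
    priority_order = priority_order[::-1]  # mirrors the reversal in the hunk

issue_queryset = issue_queryset.annotate(
    # Map each priority label to its list index so rows sort by rank.
    priority_order=Case(
        *[When(priority=p, then=Value(i)) for i, p in enumerate(priority_order)],
        output_field=IntegerField(),
    )
).order_by("priority_order")
```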

@@ -55,7 +55,9 @@ class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
.prefetch_related(
Prefetch(
"link_module",
queryset=ModuleLink.objects.select_related("module", "created_by"),
queryset=ModuleLink.objects.select_related(
"module", "created_by"
),
)
)
.annotate(
@@ -122,7 +124,13 @@ class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
def post(self, request, slug, project_id):
project = Project.objects.get(pk=project_id, workspace__slug=slug)
serializer = ModuleSerializer(data=request.data, context={"project_id": project_id, "workspace_id": project.workspace_id})
serializer = ModuleSerializer(
data=request.data,
context={
"project_id": project_id,
"workspace_id": project.workspace_id,
},
)
if serializer.is_valid():
serializer.save()
module = Module.objects.get(pk=serializer.data["id"])
@@ -131,8 +139,15 @@ class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def patch(self, request, slug, project_id, pk):
module = Module.objects.get(pk=pk, project_id=project_id, workspace__slug=slug)
serializer = ModuleSerializer(module, data=request.data, context={"project_id": project_id}, partial=True)
module = Module.objects.get(
pk=pk, project_id=project_id, workspace__slug=slug
)
serializer = ModuleSerializer(
module,
data=request.data,
context={"project_id": project_id},
partial=True,
)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_201_CREATED)
@@ -162,9 +177,13 @@ class ModuleAPIEndpoint(WebhookMixin, BaseAPIView):
)
def delete(self, request, slug, project_id, pk):
module = Module.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
module = Module.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
module_issues = list(
ModuleIssue.objects.filter(module_id=pk).values_list("issue", flat=True)
ModuleIssue.objects.filter(module_id=pk).values_list(
"issue", flat=True
)
)
issue_activity.delay(
type="module.activity.deleted",
@@ -204,7 +223,9 @@ class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
def get_queryset(self):
return (
ModuleIssue.objects.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("issue"))
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("issue")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -228,7 +249,9 @@ class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
issues = (
Issue.issue_objects.filter(issue_module__module_id=module_id)
.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -250,7 +273,9 @@ class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -271,7 +296,8 @@ class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
issues = request.data.get("issues", [])
if not len(issues):
return Response(
{"error": "Issues are required"}, status=status.HTTP_400_BAD_REQUEST
{"error": "Issues are required"},
status=status.HTTP_400_BAD_REQUEST,
)
module = Module.objects.get(
workspace__slug=slug, project_id=project_id, pk=module_id
@@ -354,7 +380,10 @@ class ModuleIssueAPIEndpoint(WebhookMixin, BaseAPIView):
def delete(self, request, slug, project_id, module_id, issue_id):
module_issue = ModuleIssue.objects.get(
workspace__slug=slug, project_id=project_id, module_id=module_id, issue_id=issue_id
workspace__slug=slug,
project_id=project_id,
module_id=module_id,
issue_id=issue_id,
)
module_issue.delete()
issue_activity.delay(


@@ -39,9 +39,15 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
def get_queryset(self):
return (
Project.objects.filter(workspace__slug=self.kwargs.get("slug"))
.filter(Q(project_projectmember__member=self.request.user) | Q(network=2))
.filter(
Q(project_projectmember__member=self.request.user)
| Q(network=2)
)
.select_related(
"workspace", "workspace__owner", "default_assignee", "project_lead"
"workspace",
"workspace__owner",
"default_assignee",
"project_lead",
)
.annotate(
is_member=Exists(
@@ -120,11 +126,18 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
request=request,
queryset=(projects),
on_results=lambda projects: ProjectSerializer(
projects, many=True, fields=self.fields, expand=self.expand,
projects,
many=True,
fields=self.fields,
expand=self.expand,
).data,
)
project = self.get_queryset().get(workspace__slug=slug, pk=project_id)
serializer = ProjectSerializer(project, fields=self.fields, expand=self.expand,)
serializer = ProjectSerializer(
project,
fields=self.fields,
expand=self.expand,
)
return Response(serializer.data, status=status.HTTP_200_OK)
def post(self, request, slug):
@@ -138,7 +151,9 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
# Add the user as Administrator to the project
project_member = ProjectMember.objects.create(
project_id=serializer.data["id"], member=request.user, role=20
project_id=serializer.data["id"],
member=request.user,
role=20,
)
# Also create the issue property for the user
_ = IssueProperty.objects.create(
@@ -211,9 +226,15 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
]
)
project = self.get_queryset().filter(pk=serializer.data["id"]).first()
project = (
self.get_queryset()
.filter(pk=serializer.data["id"])
.first()
)
serializer = ProjectSerializer(project)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(
serializer.data, status=status.HTTP_201_CREATED
)
return Response(
serializer.errors,
status=status.HTTP_400_BAD_REQUEST,
@@ -226,7 +247,8 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
)
except Workspace.DoesNotExist as e:
return Response(
{"error": "Workspace does not exist"}, status=status.HTTP_404_NOT_FOUND
{"error": "Workspace does not exist"},
status=status.HTTP_404_NOT_FOUND,
)
except ValidationError as e:
return Response(
@@ -250,7 +272,9 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
serializer.save()
if serializer.data["inbox_view"]:
Inbox.objects.get_or_create(
name=f"{project.name} Inbox", project=project, is_default=True
name=f"{project.name} Inbox",
project=project,
is_default=True,
)
# Create the triage state in Backlog group
@@ -262,10 +286,16 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
color="#ff7700",
)
project = self.get_queryset().filter(pk=serializer.data["id"]).first()
project = (
self.get_queryset()
.filter(pk=serializer.data["id"])
.first()
)
serializer = ProjectSerializer(project)
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
return Response(
serializer.errors, status=status.HTTP_400_BAD_REQUEST
)
except IntegrityError as e:
if "already exists" in str(e):
return Response(
@@ -274,7 +304,8 @@ class ProjectAPIEndpoint(WebhookMixin, BaseAPIView):
)
except (Project.DoesNotExist, Workspace.DoesNotExist):
return Response(
{"error": "Project does not exist"}, status=status.HTTP_404_NOT_FOUND
{"error": "Project does not exist"},
status=status.HTTP_404_NOT_FOUND,
)
except ValidationError as e:
return Response(

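Two side effects worth reading together in the hunks above: project creation adds the creator as an administrator (role 20 in plane's role constants) plus an IssueProperty row, and enabling `inbox_view` provisions a default Inbox idempotently. A condensed sketch, variable names as in the diff:

```python
# Condensed from the hunks above; role 20 == Administrator in plane's constants.
ProjectMember.objects.create(
    project_id=serializer.data["id"], member=request.user, role=20
)
if serializer.data["inbox_view"]:
    # get_or_create keeps this idempotent across repeated PATCHes.
    Inbox.objects.get_or_create(
        name=f"{project.name} Inbox", project=project, is_default=True
    )
```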

@@ -34,7 +34,9 @@ class StateAPIEndpoint(BaseAPIView):
)
def post(self, request, slug, project_id):
serializer = StateSerializer(data=request.data, context={"project_id": project_id})
serializer = StateSerializer(
data=request.data, context={"project_id": project_id}
)
if serializer.is_valid():
serializer.save(project_id=project_id)
return Response(serializer.data, status=status.HTTP_200_OK)
@@ -64,14 +66,19 @@ class StateAPIEndpoint(BaseAPIView):
)
if state.default:
return Response({"error": "Default state cannot be deleted"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "Default state cannot be deleted"},
status=status.HTTP_400_BAD_REQUEST,
)
# Check for any issues in the state
issue_exist = Issue.issue_objects.filter(state=state_id).exists()
if issue_exist:
return Response(
{"error": "The state is not empty, only empty states can be deleted"},
{
"error": "The state is not empty, only empty states can be deleted"
},
status=status.HTTP_400_BAD_REQUEST,
)
@@ -79,7 +86,9 @@ class StateAPIEndpoint(BaseAPIView):
return Response(status=status.HTTP_204_NO_CONTENT)
def patch(self, request, slug, project_id, state_id=None):
state = State.objects.get(workspace__slug=slug, project_id=project_id, pk=state_id)
state = State.objects.get(
workspace__slug=slug, project_id=project_id, pk=state_id
)
serializer = StateSerializer(state, data=request.data, partial=True)
if serializer.is_valid():
serializer.save()

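Pieced together, the state delete handler above is a chain of guard clauses: never remove the default state, then never remove a state that still has issues attached. The hunk shows only fragments; this sketch joins them into one method and is a reconstruction, not a quote:

```python
from rest_framework import status
from rest_framework.response import Response

def delete(self, request, slug, project_id, state_id=None):
    state = State.objects.get(
        workspace__slug=slug, project_id=project_id, pk=state_id
    )
    if state.default:
        return Response(
            {"error": "Default state cannot be deleted"},
            status=status.HTTP_400_BAD_REQUEST,
        )
    # Only empty states may be removed, otherwise issues would dangle.
    if Issue.issue_objects.filter(state=state_id).exists():
        return Response(
            {"error": "The state is not empty, only empty states can be deleted"},
            status=status.HTTP_400_BAD_REQUEST,
        )
    state.delete()
    return Response(status=status.HTTP_204_NO_CONTENT)
```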

@@ -25,7 +25,10 @@ class APIKeyAuthentication(authentication.BaseAuthentication):
def validate_api_token(self, token):
try:
api_token = APIToken.objects.get(
Q(Q(expired_at__gt=timezone.now()) | Q(expired_at__isnull=True)),
Q(
Q(expired_at__gt=timezone.now())
| Q(expired_at__isnull=True)
),
token=token,
is_active=True,
)


@@ -1,4 +1,3 @@
from .workspace import (
WorkSpaceBasePermission,
WorkspaceOwnerPermission,
@@ -13,5 +12,3 @@ from .project import (
ProjectMemberPermission,
ProjectLitePermission,
)


@@ -17,6 +17,7 @@ from .workspace import (
WorkspaceThemeSerializer,
WorkspaceMemberAdminSerializer,
WorkspaceMemberMeSerializer,
WorkspaceUserPropertiesSerializer,
)
from .project import (
ProjectSerializer,
@@ -31,14 +32,20 @@ from .project import (
ProjectDeployBoardSerializer,
ProjectMemberAdminSerializer,
ProjectPublicMemberSerializer,
ProjectMemberRoleSerializer,
)
from .state import StateSerializer, StateLiteSerializer
from .view import GlobalViewSerializer, IssueViewSerializer, IssueViewFavoriteSerializer
from .view import (
GlobalViewSerializer,
IssueViewSerializer,
IssueViewFavoriteSerializer,
)
from .cycle import (
CycleSerializer,
CycleIssueSerializer,
CycleFavoriteSerializer,
CycleWriteSerializer,
CycleUserPropertiesSerializer,
)
from .asset import FileAssetSerializer
from .issue import (
@@ -69,6 +76,7 @@ from .module import (
ModuleIssueSerializer,
ModuleLinkSerializer,
ModuleFavoriteSerializer,
ModuleUserPropertiesSerializer,
)
from .api import APITokenSerializer, APITokenReadSerializer
@@ -85,7 +93,12 @@ from .integration import (
from .importer import ImporterSerializer
from .page import PageSerializer, PageLogSerializer, SubPageSerializer, PageFavoriteSerializer
from .page import (
PageSerializer,
PageLogSerializer,
SubPageSerializer,
PageFavoriteSerializer,
)
from .estimate import (
EstimateSerializer,
@@ -93,7 +106,11 @@ from .estimate import (
EstimateReadSerializer,
)
from .inbox import InboxSerializer, InboxIssueSerializer, IssueStateInboxSerializer
from .inbox import (
InboxSerializer,
InboxIssueSerializer,
IssueStateInboxSerializer,
)
from .analytic import AnalyticViewSerializer
@@ -102,3 +119,5 @@ from .notification import NotificationSerializer
from .exporter import ExporterHistorySerializer
from .webhook import WebhookSerializer, WebhookLogSerializer
from .dashboard import DashboardSerializer, WidgetSerializer


@@ -3,7 +3,6 @@ from plane.db.models import APIToken, APIActivityLog
class APITokenSerializer(BaseSerializer):
class Meta:
model = APIToken
fields = "__all__"
@@ -18,14 +17,12 @@ class APITokenSerializer(BaseSerializer):
class APITokenReadSerializer(BaseSerializer):
class Meta:
model = APIToken
exclude = ('token',)
exclude = ("token",)
class APIActivityLogSerializer(BaseSerializer):
class Meta:
model = APIActivityLog
fields = "__all__"


@@ -4,16 +4,17 @@ from rest_framework import serializers
class BaseSerializer(serializers.ModelSerializer):
id = serializers.PrimaryKeyRelatedField(read_only=True)
class DynamicBaseSerializer(BaseSerializer):
def __init__(self, *args, **kwargs):
# If 'fields' is provided in the arguments, remove it and store it separately.
# This is done so as not to pass this custom argument up to the superclass.
fields = kwargs.pop("fields", None)
fields = kwargs.pop("fields", [])
self.expand = kwargs.pop("expand", []) or []
fields = self.expand
# Call the initialization of the superclass.
super().__init__(*args, **kwargs)
# If 'fields' was provided, filter the fields of the serializer accordingly.
if fields is not None:
self.fields = self._filter_fields(fields)
@@ -47,12 +48,97 @@ class DynamicBaseSerializer(BaseSerializer):
elif isinstance(item, dict):
allowed.append(list(item.keys())[0])
# Convert the current serializer's fields and the allowed fields to sets.
existing = set(self.fields)
allowed = set(allowed)
for field in allowed:
if field not in self.fields:
from . import (
WorkspaceLiteSerializer,
ProjectLiteSerializer,
UserLiteSerializer,
StateLiteSerializer,
IssueSerializer,
LabelSerializer,
CycleIssueSerializer,
IssueFlatSerializer,
IssueRelationSerializer,
)
# Remove fields from the serializer that aren't in the 'allowed' list.
for field_name in (existing - allowed):
self.fields.pop(field_name)
# Expansion mapper
expansion = {
"user": UserLiteSerializer,
"workspace": WorkspaceLiteSerializer,
"project": ProjectLiteSerializer,
"default_assignee": UserLiteSerializer,
"project_lead": UserLiteSerializer,
"state": StateLiteSerializer,
"created_by": UserLiteSerializer,
"issue": IssueSerializer,
"actor": UserLiteSerializer,
"owned_by": UserLiteSerializer,
"members": UserLiteSerializer,
"assignees": UserLiteSerializer,
"labels": LabelSerializer,
"issue_cycle": CycleIssueSerializer,
"parent": IssueSerializer,
"issue_relation": IssueRelationSerializer,
}
self.fields[field] = expansion[field](many=True if field in ["members", "assignees", "labels", "issue_cycle", "issue_relation"] else False)
return self.fields
def to_representation(self, instance):
response = super().to_representation(instance)
# Ensure 'expand' is iterable before processing
if self.expand:
for expand in self.expand:
if expand in self.fields:
# Import all the expandable serializers
from . import (
WorkspaceLiteSerializer,
ProjectLiteSerializer,
UserLiteSerializer,
StateLiteSerializer,
IssueSerializer,
LabelSerializer,
CycleIssueSerializer,
IssueRelationSerializer,
)
# Expansion mapper
expansion = {
"user": UserLiteSerializer,
"workspace": WorkspaceLiteSerializer,
"project": ProjectLiteSerializer,
"default_assignee": UserLiteSerializer,
"project_lead": UserLiteSerializer,
"state": StateLiteSerializer,
"created_by": UserLiteSerializer,
"issue": IssueSerializer,
"actor": UserLiteSerializer,
"owned_by": UserLiteSerializer,
"members": UserLiteSerializer,
"assignees": UserLiteSerializer,
"labels": LabelSerializer,
"issue_cycle": CycleIssueSerializer,
"parent": IssueSerializer,
"issue_relation": IssueRelationSerializer
}
# Check if field in expansion then expand the field
if expand in expansion:
if isinstance(response.get(expand), list):
exp_serializer = expansion[expand](
getattr(instance, expand), many=True
)
else:
exp_serializer = expansion[expand](
getattr(instance, expand)
)
response[expand] = exp_serializer.data
else:
# You might need to handle this case differently
response[expand] = getattr(
instance, f"{expand}_id", None
)
return response

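DynamicBaseSerializer above accepts two kwargs DRF itself does not define: `fields`, a whitelist that prunes the serialized output, and `expand`, a list of relation names swapped from bare ids to nested serializers via the expansion map. A usage sketch under those assumptions:

```python
# Hypothetical call site for the fields/expand contract implemented above.
serializer = IssueSerializer(
    issues,
    many=True,
    fields=["id", "name", "state"],  # keys outside this list are dropped
    expand=["state"],                # rendered via StateLiteSerializer per the map
)
data = serializer.data
# A relation requested in `expand` but missing from the expansion map falls
# back to the plain f"{expand}_id" attribute, per to_representation() above.
```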

@@ -7,7 +7,12 @@ from .user import UserLiteSerializer
from .issue import IssueStateSerializer
from .workspace import WorkspaceLiteSerializer
from .project import ProjectLiteSerializer
from plane.db.models import Cycle, CycleIssue, CycleFavorite
from plane.db.models import (
Cycle,
CycleIssue,
CycleFavorite,
CycleUserProperties,
)
class CycleWriteSerializer(BaseSerializer):
@@ -17,7 +22,9 @@ class CycleWriteSerializer(BaseSerializer):
and data.get("end_date", None) is not None
and data.get("start_date", None) > data.get("end_date", None)
):
raise serializers.ValidationError("Start date cannot exceed end date")
raise serializers.ValidationError(
"Start date cannot exceed end date"
)
return data
class Meta:
@@ -38,7 +45,9 @@ class CycleSerializer(BaseSerializer):
total_estimates = serializers.IntegerField(read_only=True)
completed_estimates = serializers.IntegerField(read_only=True)
started_estimates = serializers.IntegerField(read_only=True)
workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
workspace_detail = WorkspaceLiteSerializer(
read_only=True, source="workspace"
)
project_detail = ProjectLiteSerializer(read_only=True, source="project")
status = serializers.CharField(read_only=True)
@@ -48,7 +57,9 @@ class CycleSerializer(BaseSerializer):
and data.get("end_date", None) is not None
and data.get("start_date", None) > data.get("end_date", None)
):
raise serializers.ValidationError("Start date cannot exceed end date")
raise serializers.ValidationError(
"Start date cannot exceed end date"
)
return data
def get_assignees(self, obj):
@@ -106,3 +117,14 @@ class CycleFavoriteSerializer(BaseSerializer):
"project",
"user",
]
class CycleUserPropertiesSerializer(BaseSerializer):
class Meta:
model = CycleUserProperties
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"cycle" "user",
]


@@ -0,0 +1,26 @@
# Module imports
from .base import BaseSerializer
from plane.db.models import Dashboard, Widget
# Third party frameworks
from rest_framework import serializers
class DashboardSerializer(BaseSerializer):
class Meta:
model = Dashboard
fields = "__all__"
class WidgetSerializer(BaseSerializer):
is_visible = serializers.BooleanField(read_only=True)
widget_filters = serializers.JSONField(read_only=True)
class Meta:
model = Widget
fields = [
"id",
"key",
"is_visible",
"widget_filters"
]


@@ -2,12 +2,18 @@
from .base import BaseSerializer
from plane.db.models import Estimate, EstimatePoint
from plane.app.serializers import WorkspaceLiteSerializer, ProjectLiteSerializer
from plane.app.serializers import (
WorkspaceLiteSerializer,
ProjectLiteSerializer,
)
from rest_framework import serializers
class EstimateSerializer(BaseSerializer):
workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
workspace_detail = WorkspaceLiteSerializer(
read_only=True, source="workspace"
)
project_detail = ProjectLiteSerializer(read_only=True, source="project")
class Meta:
@@ -20,13 +26,14 @@ class EstimateSerializer(BaseSerializer):
class EstimatePointSerializer(BaseSerializer):
def validate(self, data):
if not data:
raise serializers.ValidationError("Estimate points are required")
value = data.get("value")
if value and len(value) > 20:
raise serializers.ValidationError("Value can't be more than 20 characters")
raise serializers.ValidationError(
"Value can't be more than 20 characters"
)
return data
class Meta:
@@ -41,7 +48,9 @@ class EstimatePointSerializer(BaseSerializer):
class EstimateReadSerializer(BaseSerializer):
points = EstimatePointSerializer(read_only=True, many=True)
workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
workspace_detail = WorkspaceLiteSerializer(
read_only=True, source="workspace"
)
project_detail = ProjectLiteSerializer(read_only=True, source="project")
class Meta:


@@ -5,7 +5,9 @@ from .user import UserLiteSerializer
class ExporterHistorySerializer(BaseSerializer):
initiated_by_detail = UserLiteSerializer(source="initiated_by", read_only=True)
initiated_by_detail = UserLiteSerializer(
source="initiated_by", read_only=True
)
class Meta:
model = ExporterHistory


@@ -7,9 +7,13 @@ from plane.db.models import Importer
class ImporterSerializer(BaseSerializer):
initiated_by_detail = UserLiteSerializer(source="initiated_by", read_only=True)
initiated_by_detail = UserLiteSerializer(
source="initiated_by", read_only=True
)
project_detail = ProjectLiteSerializer(source="project", read_only=True)
workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
workspace_detail = WorkspaceLiteSerializer(
source="workspace", read_only=True
)
class Meta:
model = Importer


@@ -46,10 +46,13 @@ class InboxIssueLiteSerializer(BaseSerializer):
class IssueStateInboxSerializer(BaseSerializer):
state_detail = StateLiteSerializer(read_only=True, source="state")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
label_details = LabelLiteSerializer(read_only=True, source="labels", many=True)
assignee_details = UserLiteSerializer(read_only=True, source="assignees", many=True)
label_details = LabelLiteSerializer(
read_only=True, source="labels", many=True
)
assignee_details = UserLiteSerializer(
read_only=True, source="assignees", many=True
)
sub_issues_count = serializers.IntegerField(read_only=True)
bridge_id = serializers.UUIDField(read_only=True)
issue_inbox = InboxIssueLiteSerializer(read_only=True, many=True)
class Meta:


@@ -13,7 +13,9 @@ class IntegrationSerializer(BaseSerializer):
class WorkspaceIntegrationSerializer(BaseSerializer):
integration_detail = IntegrationSerializer(read_only=True, source="integration")
integration_detail = IntegrationSerializer(
read_only=True, source="integration"
)
class Meta:
model = WorkspaceIntegration


@@ -30,6 +30,8 @@ from plane.db.models import (
CommentReaction,
IssueVote,
IssueRelation,
State,
Project,
)
@@ -69,19 +71,26 @@ class IssueProjectLiteSerializer(BaseSerializer):
##TODO: Find a better way to write this serializer
## Find a better approach to save manytomany?
class IssueCreateSerializer(BaseSerializer):
state_detail = StateSerializer(read_only=True, source="state")
created_by_detail = UserLiteSerializer(read_only=True, source="created_by")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
assignees = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(queryset=User.objects.all()),
# ids
state_id = serializers.PrimaryKeyRelatedField(
source="state",
queryset=State.objects.all(),
required=False,
allow_null=True,
)
parent_id = serializers.PrimaryKeyRelatedField(
source="parent",
queryset=Issue.objects.all(),
required=False,
allow_null=True,
)
label_ids = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(queryset=Label.objects.all()),
write_only=True,
required=False,
)
labels = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(queryset=Label.objects.all()),
assignee_ids = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(queryset=User.objects.all()),
write_only=True,
required=False,
)
@@ -100,8 +109,10 @@ class IssueCreateSerializer(BaseSerializer):
def to_representation(self, instance):
data = super().to_representation(instance)
data['assignees'] = [str(assignee.id) for assignee in instance.assignees.all()]
data['labels'] = [str(label.id) for label in instance.labels.all()]
assignee_ids = self.initial_data.get("assignee_ids")
data["assignee_ids"] = assignee_ids if assignee_ids else []
label_ids = self.initial_data.get("label_ids")
data["label_ids"] = label_ids if label_ids else []
return data
def validate(self, data):
@@ -110,12 +121,14 @@ class IssueCreateSerializer(BaseSerializer):
and data.get("target_date", None) is not None
and data.get("start_date", None) > data.get("target_date", None)
):
raise serializers.ValidationError("Start date cannot exceed target date")
raise serializers.ValidationError(
"Start date cannot exceed target date"
)
return data
def create(self, validated_data):
assignees = validated_data.pop("assignees", None)
labels = validated_data.pop("labels", None)
assignees = validated_data.pop("assignee_ids", None)
labels = validated_data.pop("label_ids", None)
project_id = self.context["project_id"]
workspace_id = self.context["workspace_id"]
@@ -173,8 +186,8 @@ class IssueCreateSerializer(BaseSerializer):
return issue
def update(self, instance, validated_data):
assignees = validated_data.pop("assignees", None)
labels = validated_data.pop("labels", None)
assignees = validated_data.pop("assignee_ids", None)
labels = validated_data.pop("label_ids", None)
# Related models
project_id = instance.project_id
@@ -225,14 +238,15 @@ class IssueActivitySerializer(BaseSerializer):
actor_detail = UserLiteSerializer(read_only=True, source="actor")
issue_detail = IssueFlatSerializer(read_only=True, source="issue")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
workspace_detail = WorkspaceLiteSerializer(
read_only=True, source="workspace"
)
class Meta:
model = IssueActivity
fields = "__all__"
class IssuePropertySerializer(BaseSerializer):
class Meta:
model = IssueProperty
@@ -245,7 +259,9 @@ class IssuePropertySerializer(BaseSerializer):
class LabelSerializer(BaseSerializer):
workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
workspace_detail = WorkspaceLiteSerializer(
source="workspace", read_only=True
)
project_detail = ProjectLiteSerializer(source="project", read_only=True)
class Meta:
@@ -268,7 +284,6 @@ class LabelLiteSerializer(BaseSerializer):
class IssueLabelSerializer(BaseSerializer):
class Meta:
model = IssueLabel
fields = "__all__"
@@ -279,33 +294,38 @@ class IssueLabelSerializer(BaseSerializer):
class IssueRelationSerializer(BaseSerializer):
issue_detail = IssueProjectLiteSerializer(read_only=True, source="related_issue")
id = serializers.UUIDField(source="related_issue.id", read_only=True)
project_id = serializers.PrimaryKeyRelatedField(source="related_issue.project_id", read_only=True)
sequence_id = serializers.IntegerField(source="related_issue.sequence_id", read_only=True)
relation_type = serializers.CharField(read_only=True)
class Meta:
model = IssueRelation
fields = [
"issue_detail",
"id",
"project_id",
"sequence_id",
"relation_type",
"related_issue",
"issue",
"id"
]
read_only_fields = [
"workspace",
"project",
]
class RelatedIssueSerializer(BaseSerializer):
issue_detail = IssueProjectLiteSerializer(read_only=True, source="issue")
id = serializers.UUIDField(source="issue.id", read_only=True)
project_id = serializers.PrimaryKeyRelatedField(source="issue.project_id", read_only=True)
sequence_id = serializers.IntegerField(source="issue.sequence_id", read_only=True)
relation_type = serializers.CharField(read_only=True)
class Meta:
model = IssueRelation
fields = [
"issue_detail",
"id",
"project_id",
"sequence_id",
"relation_type",
"related_issue",
"issue",
"id"
]
read_only_fields = [
"workspace",
@@ -400,7 +420,8 @@ class IssueLinkSerializer(BaseSerializer):
# Validation if url already exists
def create(self, validated_data):
if IssueLink.objects.filter(
url=validated_data.get("url"), issue_id=validated_data.get("issue_id")
url=validated_data.get("url"),
issue_id=validated_data.get("issue_id"),
).exists():
raise serializers.ValidationError(
{"error": "URL already exists for this Issue"}
@@ -424,7 +445,6 @@ class IssueAttachmentSerializer(BaseSerializer):
class IssueReactionSerializer(BaseSerializer):
actor_detail = UserLiteSerializer(read_only=True, source="actor")
class Meta:
@@ -459,12 +479,18 @@ class CommentReactionSerializer(BaseSerializer):
class IssueVoteSerializer(BaseSerializer):
actor_detail = UserLiteSerializer(read_only=True, source="actor")
class Meta:
model = IssueVote
fields = ["issue", "vote", "workspace", "project", "actor", "actor_detail"]
fields = [
"issue",
"vote",
"workspace",
"project",
"actor",
"actor_detail",
]
read_only_fields = fields
@@ -472,8 +498,12 @@ class IssueCommentSerializer(BaseSerializer):
actor_detail = UserLiteSerializer(read_only=True, source="actor")
issue_detail = IssueFlatSerializer(read_only=True, source="issue")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
comment_reactions = CommentReactionLiteSerializer(read_only=True, many=True)
workspace_detail = WorkspaceLiteSerializer(
read_only=True, source="workspace"
)
comment_reactions = CommentReactionLiteSerializer(
read_only=True, many=True
)
is_member = serializers.BooleanField(read_only=True)
class Meta:
@@ -507,12 +537,15 @@ class IssueStateFlatSerializer(BaseSerializer):
# Issue Serializer with state details
class IssueStateSerializer(DynamicBaseSerializer):
label_details = LabelLiteSerializer(read_only=True, source="labels", many=True)
label_details = LabelLiteSerializer(
read_only=True, source="labels", many=True
)
state_detail = StateLiteSerializer(read_only=True, source="state")
project_detail = ProjectLiteSerializer(read_only=True, source="project")
assignee_details = UserLiteSerializer(read_only=True, source="assignees", many=True)
assignee_details = UserLiteSerializer(
read_only=True, source="assignees", many=True
)
sub_issues_count = serializers.IntegerField(read_only=True)
bridge_id = serializers.UUIDField(read_only=True)
attachment_count = serializers.IntegerField(read_only=True)
link_count = serializers.IntegerField(read_only=True)
@@ -521,40 +554,76 @@ class IssueStateSerializer(DynamicBaseSerializer):
fields = "__all__"
class IssueSerializer(BaseSerializer):
project_detail = ProjectLiteSerializer(read_only=True, source="project")
state_detail = StateSerializer(read_only=True, source="state")
parent_detail = IssueStateFlatSerializer(read_only=True, source="parent")
label_details = LabelSerializer(read_only=True, source="labels", many=True)
assignee_details = UserLiteSerializer(read_only=True, source="assignees", many=True)
related_issues = IssueRelationSerializer(read_only=True, source="issue_relation", many=True)
issue_relations = RelatedIssueSerializer(read_only=True, source="issue_related", many=True)
issue_cycle = IssueCycleDetailSerializer(read_only=True)
issue_module = IssueModuleDetailSerializer(read_only=True)
issue_link = IssueLinkSerializer(read_only=True, many=True)
issue_attachment = IssueAttachmentSerializer(read_only=True, many=True)
class IssueSerializer(DynamicBaseSerializer):
# ids
project_id = serializers.PrimaryKeyRelatedField(read_only=True)
state_id = serializers.PrimaryKeyRelatedField(read_only=True)
parent_id = serializers.PrimaryKeyRelatedField(read_only=True)
cycle_id = serializers.PrimaryKeyRelatedField(read_only=True)
module_id = serializers.PrimaryKeyRelatedField(read_only=True)
# Many to many
label_ids = serializers.PrimaryKeyRelatedField(
read_only=True, many=True, source="labels"
)
assignee_ids = serializers.PrimaryKeyRelatedField(
read_only=True, many=True, source="assignees"
)
# Count items
sub_issues_count = serializers.IntegerField(read_only=True)
issue_reactions = IssueReactionSerializer(read_only=True, many=True)
attachment_count = serializers.IntegerField(read_only=True)
link_count = serializers.IntegerField(read_only=True)
# is_subscribed
is_subscribed = serializers.BooleanField(read_only=True)
class Meta:
model = Issue
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"created_by",
"updated_by",
fields = [
"id",
"name",
"state_id",
"description_html",
"sort_order",
"completed_at",
"estimate_point",
"priority",
"start_date",
"target_date",
"sequence_id",
"project_id",
"parent_id",
"cycle_id",
"module_id",
"label_ids",
"assignee_ids",
"sub_issues_count",
"created_at",
"updated_at",
"created_by",
"updated_by",
"attachment_count",
"link_count",
"is_subscribed",
"is_draft",
"archived_at",
]
read_only_fields = fields
class IssueLiteSerializer(DynamicBaseSerializer):
workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
workspace_detail = WorkspaceLiteSerializer(
read_only=True, source="workspace"
)
project_detail = ProjectLiteSerializer(read_only=True, source="project")
state_detail = StateLiteSerializer(read_only=True, source="state")
label_details = LabelLiteSerializer(read_only=True, source="labels", many=True)
assignee_details = UserLiteSerializer(read_only=True, source="assignees", many=True)
label_details = LabelLiteSerializer(
read_only=True, source="labels", many=True
)
assignee_details = UserLiteSerializer(
read_only=True, source="assignees", many=True
)
sub_issues_count = serializers.IntegerField(read_only=True)
cycle_id = serializers.UUIDField(read_only=True)
module_id = serializers.UUIDField(read_only=True)
@@ -581,7 +650,9 @@ class IssueLiteSerializer(DynamicBaseSerializer):
class IssuePublicSerializer(BaseSerializer):
project_detail = ProjectLiteSerializer(read_only=True, source="project")
state_detail = StateLiteSerializer(read_only=True, source="state")
reactions = IssueReactionSerializer(read_only=True, many=True, source="issue_reactions")
reactions = IssueReactionSerializer(
read_only=True, many=True, source="issue_reactions"
)
votes = IssueVoteSerializer(read_only=True, many=True)
class Meta:
@@ -604,7 +675,6 @@ class IssuePublicSerializer(BaseSerializer):
read_only_fields = fields
class IssueSubscriberSerializer(BaseSerializer):
class Meta:
model = IssueSubscriber

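The issue serializer refactor above replaces nested `assignees`/`labels` writes with `assignee_ids`/`label_ids` (write-only PrimaryKeyRelatedField lists popped in create() and update()), and the read path now returns bare ids plus counts. An illustrative payload for the new write contract; the values are made up:

```python
payload = {
    "name": "Fix login crash",
    "state_id": str(state.id),       # FKs accepted as ids and validated
    "assignee_ids": [str(user.id)],  # write-only; popped and applied in create()
    "label_ids": [],
}
serializer = IssueCreateSerializer(
    data=payload,
    context={"project_id": project.id, "workspace_id": workspace.id},
)
serializer.is_valid(raise_exception=True)
issue = serializer.save()
```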

@@ -2,7 +2,7 @@
from rest_framework import serializers
# Module imports
from .base import BaseSerializer
from .base import BaseSerializer, DynamicBaseSerializer
from .user import UserLiteSerializer
from .project import ProjectLiteSerializer
from .workspace import WorkspaceLiteSerializer
@@ -14,6 +14,7 @@ from plane.db.models import (
ModuleIssue,
ModuleLink,
ModuleFavorite,
ModuleUserProperties,
)
@@ -25,7 +26,9 @@ class ModuleWriteSerializer(BaseSerializer):
)
project_detail = ProjectLiteSerializer(source="project", read_only=True)
workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
workspace_detail = WorkspaceLiteSerializer(
source="workspace", read_only=True
)
class Meta:
model = Module
@@ -41,12 +44,18 @@ class ModuleWriteSerializer(BaseSerializer):
def to_representation(self, instance):
data = super().to_representation(instance)
data['members'] = [str(member.id) for member in instance.members.all()]
data["members"] = [str(member.id) for member in instance.members.all()]
return data
def validate(self, data):
if data.get("start_date", None) is not None and data.get("target_date", None) is not None and data.get("start_date", None) > data.get("target_date", None):
raise serializers.ValidationError("Start date cannot exceed target date")
if (
data.get("start_date", None) is not None
and data.get("target_date", None) is not None
and data.get("start_date", None) > data.get("target_date", None)
):
raise serializers.ValidationError(
"Start date cannot exceed target date"
)
return data
def create(self, validated_data):
@@ -151,7 +160,8 @@ class ModuleLinkSerializer(BaseSerializer):
# Validation if url already exists
def create(self, validated_data):
if ModuleLink.objects.filter(
url=validated_data.get("url"), module_id=validated_data.get("module_id")
url=validated_data.get("url"),
module_id=validated_data.get("module_id"),
).exists():
raise serializers.ValidationError(
{"error": "URL already exists for this Issue"}
@@ -159,10 +169,12 @@ class ModuleLinkSerializer(BaseSerializer):
return ModuleLink.objects.create(**validated_data)
class ModuleSerializer(BaseSerializer):
class ModuleSerializer(DynamicBaseSerializer):
project_detail = ProjectLiteSerializer(read_only=True, source="project")
lead_detail = UserLiteSerializer(read_only=True, source="lead")
members_detail = UserLiteSerializer(read_only=True, many=True, source="members")
members_detail = UserLiteSerializer(
read_only=True, many=True, source="members"
)
link_module = ModuleLinkSerializer(read_only=True, many=True)
is_favorite = serializers.BooleanField(read_only=True)
total_issues = serializers.IntegerField(read_only=True)
@@ -196,3 +208,10 @@ class ModuleFavoriteSerializer(BaseSerializer):
"project",
"user",
]
class ModuleUserPropertiesSerializer(BaseSerializer):
class Meta:
model = ModuleUserProperties
fields = "__all__"
read_only_fields = ["workspace", "project", "module", "user"]


@@ -3,10 +3,12 @@ from .base import BaseSerializer
from .user import UserLiteSerializer
from plane.db.models import Notification
class NotificationSerializer(BaseSerializer):
triggered_by_details = UserLiteSerializer(read_only=True, source="triggered_by")
triggered_by_details = UserLiteSerializer(
read_only=True, source="triggered_by"
)
class Meta:
model = Notification
fields = "__all__"


@@ -6,19 +6,31 @@ from .base import BaseSerializer
from .issue import IssueFlatSerializer, LabelLiteSerializer
from .workspace import WorkspaceLiteSerializer
from .project import ProjectLiteSerializer
from plane.db.models import Page, PageLog, PageFavorite, PageLabel, Label, Issue, Module
from plane.db.models import (
Page,
PageLog,
PageFavorite,
PageLabel,
Label,
Issue,
Module,
)
class PageSerializer(BaseSerializer):
is_favorite = serializers.BooleanField(read_only=True)
label_details = LabelLiteSerializer(read_only=True, source="labels", many=True)
label_details = LabelLiteSerializer(
read_only=True, source="labels", many=True
)
labels = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(queryset=Label.objects.all()),
write_only=True,
required=False,
)
project_detail = ProjectLiteSerializer(source="project", read_only=True)
workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
workspace_detail = WorkspaceLiteSerializer(
source="workspace", read_only=True
)
class Meta:
model = Page
@@ -28,9 +40,10 @@ class PageSerializer(BaseSerializer):
"project",
"owned_by",
]
def to_representation(self, instance):
data = super().to_representation(instance)
data['labels'] = [str(label.id) for label in instance.labels.all()]
data["labels"] = [str(label.id) for label in instance.labels.all()]
return data
def create(self, validated_data):
@@ -94,7 +107,7 @@ class SubPageSerializer(BaseSerializer):
def get_entity_details(self, obj):
entity_name = obj.entity_name
if entity_name == 'forward_link' or entity_name == 'back_link':
if entity_name == "forward_link" or entity_name == "back_link":
try:
page = Page.objects.get(pk=obj.entity_identifier)
return PageSerializer(page).data
@@ -104,7 +117,6 @@ class SubPageSerializer(BaseSerializer):
class PageLogSerializer(BaseSerializer):
class Meta:
model = PageLog
fields = "__all__"


@@ -4,7 +4,10 @@ from rest_framework import serializers
# Module imports
from .base import BaseSerializer, DynamicBaseSerializer
from plane.app.serializers.workspace import WorkspaceLiteSerializer
from plane.app.serializers.user import UserLiteSerializer, UserAdminLiteSerializer
from plane.app.serializers.user import (
UserLiteSerializer,
UserAdminLiteSerializer,
)
from plane.db.models import (
Project,
ProjectMember,
@@ -17,7 +20,9 @@ from plane.db.models import (
class ProjectSerializer(BaseSerializer):
workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
workspace_detail = WorkspaceLiteSerializer(
source="workspace", read_only=True
)
class Meta:
model = Project
@@ -29,12 +34,16 @@ class ProjectSerializer(BaseSerializer):
def create(self, validated_data):
identifier = validated_data.get("identifier", "").strip().upper()
if identifier == "":
raise serializers.ValidationError(detail="Project Identifier is required")
raise serializers.ValidationError(
detail="Project Identifier is required"
)
if ProjectIdentifier.objects.filter(
name=identifier, workspace_id=self.context["workspace_id"]
).exists():
raise serializers.ValidationError(detail="Project Identifier is taken")
raise serializers.ValidationError(
detail="Project Identifier is taken"
)
project = Project.objects.create(
**validated_data, workspace_id=self.context["workspace_id"]
)
@@ -73,7 +82,9 @@ class ProjectSerializer(BaseSerializer):
return project
# If not same fail update
raise serializers.ValidationError(detail="Project Identifier is already taken")
raise serializers.ValidationError(
detail="Project Identifier is already taken"
)
class ProjectLiteSerializer(BaseSerializer):
@@ -160,6 +171,12 @@ class ProjectMemberAdminSerializer(BaseSerializer):
fields = "__all__"
class ProjectMemberRoleSerializer(DynamicBaseSerializer):
class Meta:
model = ProjectMember
fields = ("id", "role", "member", "project")
class ProjectMemberInviteSerializer(BaseSerializer):
project = ProjectLiteSerializer(read_only=True)
workspace = WorkspaceLiteSerializer(read_only=True)
@@ -197,7 +214,9 @@ class ProjectMemberLiteSerializer(BaseSerializer):
class ProjectDeployBoardSerializer(BaseSerializer):
project_details = ProjectLiteSerializer(read_only=True, source="project")
workspace_detail = WorkspaceLiteSerializer(read_only=True, source="workspace")
workspace_detail = WorkspaceLiteSerializer(
read_only=True, source="workspace"
)
class Meta:
model = ProjectDeployBoard


@@ -6,7 +6,6 @@ from plane.db.models import State
class StateSerializer(BaseSerializer):
class Meta:
model = State
fields = "__all__"


@@ -99,7 +99,9 @@ class UserMeSettingsSerializer(BaseSerializer):
).first()
return {
"last_workspace_id": obj.last_workspace_id,
"last_workspace_slug": workspace.slug if workspace is not None else "",
"last_workspace_slug": workspace.slug
if workspace is not None
else "",
"fallback_workspace_id": obj.last_workspace_id,
"fallback_workspace_slug": workspace.slug
if workspace is not None
@@ -109,7 +111,8 @@ class UserMeSettingsSerializer(BaseSerializer):
else:
fallback_workspace = (
Workspace.objects.filter(
workspace_member__member_id=obj.id, workspace_member__is_active=True
workspace_member__member_id=obj.id,
workspace_member__is_active=True,
)
.order_by("created_at")
.first()
@@ -180,7 +183,9 @@ class ChangePasswordSerializer(serializers.Serializer):
if data.get("new_password") != data.get("confirm_password"):
raise serializers.ValidationError(
{"error": "Confirm password should be same as the new password."}
{
"error": "Confirm password should be same as the new password."
}
)
return data
@@ -190,4 +195,5 @@ class ResetPasswordSerializer(serializers.Serializer):
"""
Serializer for password change endpoint.
"""
new_password = serializers.CharField(required=True, min_length=8)


@@ -2,7 +2,7 @@
from rest_framework import serializers
# Module imports
from .base import BaseSerializer
from .base import BaseSerializer, DynamicBaseSerializer
from .workspace import WorkspaceLiteSerializer
from .project import ProjectLiteSerializer
from plane.db.models import GlobalView, IssueView, IssueViewFavorite
@@ -10,7 +10,9 @@ from plane.utils.issue_filters import issue_filters
class GlobalViewSerializer(BaseSerializer):
workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
workspace_detail = WorkspaceLiteSerializer(
source="workspace", read_only=True
)
class Meta:
model = GlobalView
@@ -38,10 +40,12 @@ class GlobalViewSerializer(BaseSerializer):
return super().update(instance, validated_data)
class IssueViewSerializer(BaseSerializer):
class IssueViewSerializer(DynamicBaseSerializer):
is_favorite = serializers.BooleanField(read_only=True)
project_detail = ProjectLiteSerializer(source="project", read_only=True)
workspace_detail = WorkspaceLiteSerializer(source="workspace", read_only=True)
workspace_detail = WorkspaceLiteSerializer(
source="workspace", read_only=True
)
class Meta:
model = IssueView


@@ -12,6 +12,7 @@ from .base import DynamicBaseSerializer
from plane.db.models import Webhook, WebhookLog
from plane.db.models.webhook import validate_domain, validate_schema
class WebhookSerializer(DynamicBaseSerializer):
url = serializers.URLField(validators=[validate_schema, validate_domain])
@@ -21,32 +22,49 @@ class WebhookSerializer(DynamicBaseSerializer):
# Extract the hostname from the URL
hostname = urlparse(url).hostname
if not hostname:
raise serializers.ValidationError({"url": "Invalid URL: No hostname found."})
raise serializers.ValidationError(
{"url": "Invalid URL: No hostname found."}
)
# Resolve the hostname to IP addresses
try:
ip_addresses = socket.getaddrinfo(hostname, None)
except socket.gaierror:
raise serializers.ValidationError({"url": "Hostname could not be resolved."})
raise serializers.ValidationError(
{"url": "Hostname could not be resolved."}
)
if not ip_addresses:
raise serializers.ValidationError({"url": "No IP addresses found for the hostname."})
raise serializers.ValidationError(
{"url": "No IP addresses found for the hostname."}
)
for addr in ip_addresses:
ip = ipaddress.ip_address(addr[4][0])
if ip.is_private or ip.is_loopback:
raise serializers.ValidationError({"url": "URL resolves to a blocked IP address."})
raise serializers.ValidationError(
{"url": "URL resolves to a blocked IP address."}
)
# Additional validation for multiple request domains and their subdomains
request = self.context.get('request')
disallowed_domains = ['plane.so',] # Add your disallowed domains here
request = self.context.get("request")
disallowed_domains = [
"plane.so",
] # Add your disallowed domains here
if request:
request_host = request.get_host().split(':')[0] # Remove port if present
request_host = request.get_host().split(":")[
0
] # Remove port if present
disallowed_domains.append(request_host)
# Check if hostname is a subdomain or exact match of any disallowed domain
if any(hostname == domain or hostname.endswith('.' + domain) for domain in disallowed_domains):
raise serializers.ValidationError({"url": "URL domain or its subdomain is not allowed."})
if any(
hostname == domain or hostname.endswith("." + domain)
for domain in disallowed_domains
):
raise serializers.ValidationError(
{"url": "URL domain or its subdomain is not allowed."}
)
return Webhook.objects.create(**validated_data)
@@ -56,32 +74,49 @@ class WebhookSerializer(DynamicBaseSerializer):
# Extract the hostname from the URL
hostname = urlparse(url).hostname
if not hostname:
raise serializers.ValidationError({"url": "Invalid URL: No hostname found."})
raise serializers.ValidationError(
{"url": "Invalid URL: No hostname found."}
)
# Resolve the hostname to IP addresses
try:
ip_addresses = socket.getaddrinfo(hostname, None)
except socket.gaierror:
raise serializers.ValidationError({"url": "Hostname could not be resolved."})
raise serializers.ValidationError(
{"url": "Hostname could not be resolved."}
)
if not ip_addresses:
raise serializers.ValidationError({"url": "No IP addresses found for the hostname."})
raise serializers.ValidationError(
{"url": "No IP addresses found for the hostname."}
)
for addr in ip_addresses:
ip = ipaddress.ip_address(addr[4][0])
if ip.is_private or ip.is_loopback:
raise serializers.ValidationError({"url": "URL resolves to a blocked IP address."})
raise serializers.ValidationError(
{"url": "URL resolves to a blocked IP address."}
)
# Additional validation for multiple request domains and their subdomains
request = self.context.get('request')
disallowed_domains = ['plane.so',] # Add your disallowed domains here
request = self.context.get("request")
disallowed_domains = [
"plane.so",
] # Add your disallowed domains here
if request:
request_host = request.get_host().split(':')[0] # Remove port if present
request_host = request.get_host().split(":")[
0
] # Remove port if present
disallowed_domains.append(request_host)
# Check if hostname is a subdomain or exact match of any disallowed domain
if any(hostname == domain or hostname.endswith('.' + domain) for domain in disallowed_domains):
raise serializers.ValidationError({"url": "URL domain or its subdomain is not allowed."})
if any(
hostname == domain or hostname.endswith("." + domain)
for domain in disallowed_domains
):
raise serializers.ValidationError(
{"url": "URL domain or its subdomain is not allowed."}
)
return super().update(instance, validated_data)
@@ -95,12 +130,7 @@ class WebhookSerializer(DynamicBaseSerializer):
class WebhookLogSerializer(DynamicBaseSerializer):
class Meta:
model = WebhookLog
fields = "__all__"
read_only_fields = [
"workspace",
"webhook"
]
read_only_fields = ["workspace", "webhook"]

View File

@@ -2,7 +2,7 @@
from rest_framework import serializers
# Module imports
from .base import BaseSerializer
from .base import BaseSerializer, DynamicBaseSerializer
from .user import UserLiteSerializer, UserAdminLiteSerializer
from plane.db.models import (
@@ -13,10 +13,11 @@ from plane.db.models import (
TeamMember,
WorkspaceMemberInvite,
WorkspaceTheme,
WorkspaceUserProperties,
)
class WorkSpaceSerializer(BaseSerializer):
class WorkSpaceSerializer(DynamicBaseSerializer):
owner = UserLiteSerializer(read_only=True)
total_members = serializers.IntegerField(read_only=True)
total_issues = serializers.IntegerField(read_only=True)
@@ -50,6 +51,7 @@ class WorkSpaceSerializer(BaseSerializer):
"owner",
]
class WorkspaceLiteSerializer(BaseSerializer):
class Meta:
model = Workspace
@@ -61,8 +63,7 @@ class WorkspaceLiteSerializer(BaseSerializer):
read_only_fields = fields
class WorkSpaceMemberSerializer(BaseSerializer):
class WorkSpaceMemberSerializer(DynamicBaseSerializer):
member = UserLiteSerializer(read_only=True)
workspace = WorkspaceLiteSerializer(read_only=True)
@@ -72,13 +73,12 @@ class WorkSpaceMemberSerializer(BaseSerializer):
class WorkspaceMemberMeSerializer(BaseSerializer):
class Meta:
model = WorkspaceMember
fields = "__all__"
class WorkspaceMemberAdminSerializer(BaseSerializer):
class WorkspaceMemberAdminSerializer(DynamicBaseSerializer):
member = UserAdminLiteSerializer(read_only=True)
workspace = WorkspaceLiteSerializer(read_only=True)
@@ -108,7 +108,9 @@ class WorkSpaceMemberInviteSerializer(BaseSerializer):
class TeamSerializer(BaseSerializer):
members_detail = UserLiteSerializer(read_only=True, source="members", many=True)
members_detail = UserLiteSerializer(
read_only=True, source="members", many=True
)
members = serializers.ListField(
child=serializers.PrimaryKeyRelatedField(queryset=User.objects.all()),
write_only=True,
@@ -145,7 +147,9 @@ class TeamSerializer(BaseSerializer):
members = validated_data.pop("members")
TeamMember.objects.filter(team=instance).delete()
team_members = [
TeamMember(member=member, team=instance, workspace=instance.workspace)
TeamMember(
member=member, team=instance, workspace=instance.workspace
)
for member in members
]
TeamMember.objects.bulk_create(team_members, batch_size=10)
@@ -161,3 +165,13 @@ class WorkspaceThemeSerializer(BaseSerializer):
"workspace",
"actor",
]
class WorkspaceUserPropertiesSerializer(BaseSerializer):
class Meta:
model = WorkspaceUserProperties
fields = "__all__"
read_only_fields = [
"workspace",
"user",
]
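Several serializers in this file switch their base class from BaseSerializer to DynamicBaseSerializer. Its implementation is not part of this diff; a minimal sketch, assuming it follows the usual DRF dynamic-fields pattern of accepting a fields kwarg and dropping everything not requested:

from rest_framework import serializers

class DynamicBaseSerializer(serializers.ModelSerializer):
    def __init__(self, *args, **kwargs):
        # Pop the optional `fields` kwarg before the parent sees it
        fields = kwargs.pop("fields", None)
        super().__init__(*args, **kwargs)
        if fields is not None:
            # Drop any declared field that was not requested
            for field_name in set(self.fields) - set(fields):
                self.fields.pop(field_name)

Under that assumption, WorkSpaceSerializer(workspace, fields=["id", "name"]).data would serialize only the requested columns.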

View File

@@ -3,6 +3,7 @@ from .asset import urlpatterns as asset_urls
from .authentication import urlpatterns as authentication_urls
from .config import urlpatterns as configuration_urls
from .cycle import urlpatterns as cycle_urls
from .dashboard import urlpatterns as dashboard_urls
from .estimate import urlpatterns as estimate_urls
from .external import urlpatterns as external_urls
from .importer import urlpatterns as importer_urls
@@ -28,6 +29,7 @@ urlpatterns = [
*authentication_urls,
*configuration_urls,
*cycle_urls,
*dashboard_urls,
*estimate_urls,
*external_urls,
*importer_urls,

View File

@@ -31,8 +31,14 @@ urlpatterns = [
path("sign-in/", SignInEndpoint.as_view(), name="sign-in"),
path("sign-out/", SignOutEndpoint.as_view(), name="sign-out"),
# magic sign in
path("magic-generate/", MagicGenerateEndpoint.as_view(), name="magic-generate"),
path("magic-sign-in/", MagicSignInEndpoint.as_view(), name="magic-sign-in"),
path(
"magic-generate/",
MagicGenerateEndpoint.as_view(),
name="magic-generate",
),
path(
"magic-sign-in/", MagicSignInEndpoint.as_view(), name="magic-sign-in"
),
path("token/refresh/", TokenRefreshView.as_view(), name="token_refresh"),
# Password Manipulation
path(
@@ -52,6 +58,8 @@ urlpatterns = [
),
# API Tokens
path("api-tokens/", ApiTokenEndpoint.as_view(), name="api-tokens"),
path("api-tokens/<uuid:pk>/", ApiTokenEndpoint.as_view(), name="api-tokens"),
path(
"api-tokens/<uuid:pk>/", ApiTokenEndpoint.as_view(), name="api-tokens"
),
## End API Tokens
]

View File

@@ -1,7 +1,7 @@
from django.urls import path
from plane.app.views import ConfigurationEndpoint
from plane.app.views import ConfigurationEndpoint, MobileConfigurationEndpoint
urlpatterns = [
path(
@@ -9,4 +9,9 @@ urlpatterns = [
ConfigurationEndpoint.as_view(),
name="configuration",
),
path(
"mobile-configs/",
MobileConfigurationEndpoint.as_view(),
name="configuration",
),
]

View File

@@ -7,10 +7,17 @@ from plane.app.views import (
CycleDateCheckEndpoint,
CycleFavoriteViewSet,
TransferCycleIssueEndpoint,
CycleUserPropertiesEndpoint,
ActiveCycleEndpoint
)
urlpatterns = [
path(
"workspaces/<str:slug>/active-cycles/",
ActiveCycleEndpoint.as_view(),
name="workspace-active-cycle",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/",
CycleViewSet.as_view(
@@ -44,7 +51,7 @@ urlpatterns = [
name="project-issue-cycle",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/cycle-issues/<uuid:pk>/",
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/cycle-issues/<uuid:issue_id>/",
CycleIssueViewSet.as_view(
{
"get": "retrieve",
@@ -84,4 +91,9 @@ urlpatterns = [
TransferCycleIssueEndpoint.as_view(),
name="transfer-issues",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/cycles/<uuid:cycle_id>/user-properties/",
CycleUserPropertiesEndpoint.as_view(),
name="cycle-user-filters",
),
]
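Note the cycle-issues detail route is now keyed by issue_id rather than the CycleIssue row's pk, so clients address entries by the issue itself. A hypothetical URL under the new scheme (the slug and ids are placeholders):

slug, project_id, cycle_id, issue_id = "acme", "proj-uuid", "cycle-uuid", "issue-uuid"
# DELETE now targets the issue directly instead of the through-model pk
url = (
    f"/workspaces/{slug}/projects/{project_id}"
    f"/cycles/{cycle_id}/cycle-issues/{issue_id}/"
)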

View File

@@ -0,0 +1,23 @@
from django.urls import path
from plane.app.views import DashboardEndpoint, WidgetsEndpoint
urlpatterns = [
path(
"workspaces/<str:slug>/dashboard/",
DashboardEndpoint.as_view(),
name="dashboard",
),
path(
"workspaces/<str:slug>/dashboard/<uuid:dashboard_id>/",
DashboardEndpoint.as_view(),
name="dashboard",
),
path(
"dashboard/<uuid:dashboard_id>/widgets/<uuid:widget_id>/",
WidgetsEndpoint.as_view(),
name="widgets",
),
]
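Both workspace-level dashboard routes share the name "dashboard", so reverse() disambiguates them by the kwargs supplied. A quick sketch, assuming these urlpatterns are installed and ignoring whatever prefix the project mounts them under:

from django.urls import reverse

# Resolves to "/workspaces/acme/dashboard/" (plus any include() prefix);
# passing dashboard_id as well would match the second route instead.
reverse("dashboard", kwargs={"slug": "acme"})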

View File

@@ -40,7 +40,7 @@ urlpatterns = [
name="inbox-issue",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/inboxes/<uuid:inbox_id>/inbox-issues/<uuid:pk>/",
"workspaces/<str:slug>/projects/<uuid:project_id>/inboxes/<uuid:inbox_id>/inbox-issues/<uuid:issue_id>/",
InboxIssueViewSet.as_view(
{
"get": "retrieve",

View File

@@ -235,7 +235,7 @@ urlpatterns = [
## End Comment Reactions
## IssueProperty
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issue-display-properties/",
"workspaces/<str:slug>/projects/<uuid:project_id>/user-properties/",
IssueUserDisplayPropertyEndpoint.as_view(),
name="project-issue-display-properties",
),
@@ -275,16 +275,17 @@ urlpatterns = [
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-relation/",
IssueRelationViewSet.as_view(
{
"get": "list",
"post": "create",
}
),
name="issue-relation",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/issue-relation/<uuid:pk>/",
"workspaces/<str:slug>/projects/<uuid:project_id>/issues/<uuid:issue_id>/remove-relation/",
IssueRelationViewSet.as_view(
{
"delete": "destroy",
"post": "remove_relation",
}
),
name="issue-relation",

View File

@@ -7,6 +7,7 @@ from plane.app.views import (
ModuleLinkViewSet,
ModuleFavoriteViewSet,
BulkImportModulesEndpoint,
ModuleUserPropertiesEndpoint,
)
@@ -44,7 +45,7 @@ urlpatterns = [
name="project-module-issues",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/module-issues/<uuid:pk>/",
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/module-issues/<uuid:issue_id>/",
ModuleIssueViewSet.as_view(
{
"get": "retrieve",
@@ -101,4 +102,9 @@ urlpatterns = [
BulkImportModulesEndpoint.as_view(),
name="bulk-modules-create",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/modules/<uuid:module_id>/user-properties/",
ModuleUserPropertiesEndpoint.as_view(),
name="cycle-user-filters",
),
]

View File

@@ -18,6 +18,8 @@ from plane.app.views import (
WorkspaceUserProfileEndpoint,
WorkspaceUserProfileIssuesEndpoint,
WorkspaceLabelsEndpoint,
WorkspaceProjectMemberEndpoint,
WorkspaceUserPropertiesEndpoint,
)
@@ -92,6 +94,11 @@ urlpatterns = [
WorkSpaceMemberViewSet.as_view({"get": "list"}),
name="workspace-member",
),
path(
"workspaces/<str:slug>/project-members/",
WorkspaceProjectMemberEndpoint.as_view(),
name="workspace-member-roles",
),
path(
"workspaces/<str:slug>/members/<uuid:pk>/",
WorkSpaceMemberViewSet.as_view(
@@ -195,4 +202,9 @@ urlpatterns = [
WorkspaceLabelsEndpoint.as_view(),
name="workspace-labels",
),
path(
"workspaces/<str:slug>/user-properties/",
WorkspaceUserPropertiesEndpoint.as_view(),
name="workspace-user-filters",
),
]

View File

@@ -192,7 +192,7 @@ from plane.app.views import (
)
#TODO: Delete this file
# TODO: Delete this file
# This url file has been deprecated; use the apiserver/plane/urls folder to create new urls
urlpatterns = [
@@ -204,10 +204,14 @@ urlpatterns = [
path("sign-out/", SignOutEndpoint.as_view(), name="sign-out"),
# Magic Sign In/Up
path(
"magic-generate/", MagicSignInGenerateEndpoint.as_view(), name="magic-generate"
"magic-generate/",
MagicSignInGenerateEndpoint.as_view(),
name="magic-generate",
),
path("magic-sign-in/", MagicSignInEndpoint.as_view(), name="magic-sign-in"),
path('token/refresh/', TokenRefreshView.as_view(), name='token_refresh'),
path(
"magic-sign-in/", MagicSignInEndpoint.as_view(), name="magic-sign-in"
),
path("token/refresh/", TokenRefreshView.as_view(), name="token_refresh"),
# Email verification
path("email-verify/", VerifyEmailEndpoint.as_view(), name="email-verify"),
path(
@@ -272,7 +276,9 @@ urlpatterns = [
# user workspace invitations
path(
"users/me/invitations/workspaces/",
UserWorkspaceInvitationsEndpoint.as_view({"get": "list", "post": "create"}),
UserWorkspaceInvitationsEndpoint.as_view(
{"get": "list", "post": "create"}
),
name="user-workspace-invitations",
),
# user workspace invitation
@@ -311,7 +317,9 @@ urlpatterns = [
# user project invitations
path(
"users/me/invitations/projects/",
UserProjectInvitationsViewset.as_view({"get": "list", "post": "create"}),
UserProjectInvitationsViewset.as_view(
{"get": "list", "post": "create"}
),
name="user-project-invitaions",
),
## Workspaces ##
@@ -1238,7 +1246,7 @@ urlpatterns = [
"post": "unarchive",
}
),
name="project-page-unarchive"
name="project-page-unarchive",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/archived-pages/",
@@ -1264,19 +1272,22 @@ urlpatterns = [
{
"post": "unlock",
}
)
),
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/transactions/",
PageLogEndpoint.as_view(), name="page-transactions"
PageLogEndpoint.as_view(),
name="page-transactions",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/transactions/<uuid:transaction>/",
PageLogEndpoint.as_view(), name="page-transactions"
PageLogEndpoint.as_view(),
name="page-transactions",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/sub-pages/",
SubPagesEndpoint.as_view(), name="sub-page"
SubPagesEndpoint.as_view(),
name="sub-page",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/estimates/",
@@ -1326,7 +1337,9 @@ urlpatterns = [
## End Pages
# API Tokens
path("api-tokens/", ApiTokenEndpoint.as_view(), name="api-tokens"),
path("api-tokens/<uuid:pk>/", ApiTokenEndpoint.as_view(), name="api-tokens"),
path(
"api-tokens/<uuid:pk>/", ApiTokenEndpoint.as_view(), name="api-tokens"
),
## End API Tokens
# Integrations
path(

View File

@@ -45,6 +45,8 @@ from .workspace import (
WorkspaceUserProfileEndpoint,
WorkspaceUserProfileIssuesEndpoint,
WorkspaceLabelsEndpoint,
WorkspaceProjectMemberEndpoint,
WorkspaceUserPropertiesEndpoint,
)
from .state import StateViewSet
from .view import (
@@ -59,6 +61,8 @@ from .cycle import (
CycleDateCheckEndpoint,
CycleFavoriteViewSet,
TransferCycleIssueEndpoint,
CycleUserPropertiesEndpoint,
ActiveCycleEndpoint,
)
from .asset import FileAssetEndpoint, UserAssetsEndpoint, FileAssetViewSet
from .issue import (
@@ -103,6 +107,7 @@ from .module import (
ModuleIssueViewSet,
ModuleLinkViewSet,
ModuleFavoriteViewSet,
ModuleUserPropertiesEndpoint,
)
from .api import ApiTokenEndpoint
@@ -136,7 +141,11 @@ from .page import (
from .search import GlobalSearchEndpoint, IssueSearchEndpoint
from .external import GPTIntegrationEndpoint, ReleaseNotesEndpoint, UnsplashEndpoint
from .external import (
GPTIntegrationEndpoint,
ReleaseNotesEndpoint,
UnsplashEndpoint,
)
from .estimate import (
ProjectEstimatePointEndpoint,
@@ -161,10 +170,15 @@ from .notification import (
from .exporter import ExportIssuesEndpoint
from .config import ConfigurationEndpoint
from .config import ConfigurationEndpoint, MobileConfigurationEndpoint
from .webhook import (
WebhookEndpoint,
WebhookLogsEndpoint,
WebhookSecretRegenerateEndpoint,
)
from .dashboard import (
DashboardEndpoint,
WidgetsEndpoint
)

View File

@@ -61,7 +61,9 @@ class AnalyticsEndpoint(BaseAPIView):
)
# If segment is present it cannot be the same as the x-axis
if segment and (segment not in valid_xaxis_segment or x_axis == segment):
if segment and (
segment not in valid_xaxis_segment or x_axis == segment
):
return Response(
{
"error": "Both segment and x axis cannot be same and segment should be valid"
@@ -110,7 +112,9 @@ class AnalyticsEndpoint(BaseAPIView):
if x_axis in ["assignees__id"] or segment in ["assignees__id"]:
assignee_details = (
Issue.issue_objects.filter(
workspace__slug=slug, **filters, assignees__avatar__isnull=False
workspace__slug=slug,
**filters,
assignees__avatar__isnull=False,
)
.order_by("assignees__id")
.distinct("assignees__id")
@@ -124,7 +128,9 @@ class AnalyticsEndpoint(BaseAPIView):
)
cycle_details = {}
if x_axis in ["issue_cycle__cycle_id"] or segment in ["issue_cycle__cycle_id"]:
if x_axis in ["issue_cycle__cycle_id"] or segment in [
"issue_cycle__cycle_id"
]:
cycle_details = (
Issue.issue_objects.filter(
workspace__slug=slug,
@@ -186,7 +192,9 @@ class AnalyticViewViewset(BaseViewSet):
def get_queryset(self):
return self.filter_queryset(
super().get_queryset().filter(workspace__slug=self.kwargs.get("slug"))
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
)
@@ -196,7 +204,9 @@ class SavedAnalyticEndpoint(BaseAPIView):
]
def get(self, request, slug, analytic_id):
analytic_view = AnalyticView.objects.get(pk=analytic_id, workspace__slug=slug)
analytic_view = AnalyticView.objects.get(
pk=analytic_id, workspace__slug=slug
)
filter = analytic_view.query
queryset = Issue.issue_objects.filter(**filter)
@@ -266,7 +276,9 @@ class ExportAnalyticsEndpoint(BaseAPIView):
)
# If segment is present it cannot be the same as the x-axis
if segment and (segment not in valid_xaxis_segment or x_axis == segment):
if segment and (
segment not in valid_xaxis_segment or x_axis == segment
):
return Response(
{
"error": "Both segment and x axis cannot be same and segment should be valid"
@@ -293,7 +305,9 @@ class DefaultAnalyticsEndpoint(BaseAPIView):
def get(self, request, slug):
filters = issue_filters(request.GET, "GET")
base_issues = Issue.issue_objects.filter(workspace__slug=slug, **filters)
base_issues = Issue.issue_objects.filter(
workspace__slug=slug, **filters
)
total_issues = base_issues.count()
@@ -306,7 +320,9 @@ class DefaultAnalyticsEndpoint(BaseAPIView):
)
open_issues_groups = ["backlog", "unstarted", "started"]
open_issues_queryset = state_groups.filter(state__group__in=open_issues_groups)
open_issues_queryset = state_groups.filter(
state__group__in=open_issues_groups
)
open_issues = open_issues_queryset.count()
open_issues_classified = (
@@ -361,10 +377,12 @@ class DefaultAnalyticsEndpoint(BaseAPIView):
.order_by("-count")
)
open_estimate_sum = open_issues_queryset.aggregate(sum=Sum("estimate_point"))[
open_estimate_sum = open_issues_queryset.aggregate(
sum=Sum("estimate_point")
)["sum"]
total_estimate_sum = base_issues.aggregate(sum=Sum("estimate_point"))[
"sum"
]
total_estimate_sum = base_issues.aggregate(sum=Sum("estimate_point"))["sum"]
return Response(
{
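The rewrapped guard in AnalyticsEndpoint and ExportAnalyticsEndpoint encodes one rule: a segment, when given, must be an allowed key and must differ from the x-axis. The same predicate in isolation (a sketch; valid_xaxis_segment stands for the endpoint's allowed list):

def segment_is_valid(x_axis, segment, valid_xaxis_segment):
    # No segment is always fine; otherwise it must be allowed and distinct
    return not segment or (
        segment in valid_xaxis_segment and x_axis != segment
    )

assert segment_is_valid("priority", None, ["state__group"])
assert not segment_is_valid("priority", "priority", ["priority"])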

View File

@@ -71,7 +71,9 @@ class ApiTokenEndpoint(BaseAPIView):
user=request.user,
pk=pk,
)
serializer = APITokenSerializer(api_token, data=request.data, partial=True)
serializer = APITokenSerializer(
api_token, data=request.data, partial=True
)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK)

View File

@@ -10,7 +10,11 @@ from plane.app.serializers import FileAssetSerializer
class FileAssetEndpoint(BaseAPIView):
parser_classes = (MultiPartParser, FormParser, JSONParser,)
parser_classes = (
MultiPartParser,
FormParser,
JSONParser,
)
"""
A viewset for viewing and editing task instances.
@@ -20,10 +24,18 @@ class FileAssetEndpoint(BaseAPIView):
asset_key = str(workspace_id) + "/" + asset_key
files = FileAsset.objects.filter(asset=asset_key)
if files.exists():
serializer = FileAssetSerializer(files, context={"request": request}, many=True)
return Response({"data": serializer.data, "status": True}, status=status.HTTP_200_OK)
serializer = FileAssetSerializer(
files, context={"request": request}, many=True
)
return Response(
{"data": serializer.data, "status": True},
status=status.HTTP_200_OK,
)
else:
return Response({"error": "Asset key does not exist", "status": False}, status=status.HTTP_200_OK)
return Response(
{"error": "Asset key does not exist", "status": False},
status=status.HTTP_200_OK,
)
def post(self, request, slug):
serializer = FileAssetSerializer(data=request.data)
@@ -43,7 +55,6 @@ class FileAssetEndpoint(BaseAPIView):
class FileAssetViewSet(BaseViewSet):
def restore(self, request, workspace_id, asset_key):
asset_key = str(workspace_id) + "/" + asset_key
file_asset = FileAsset.objects.get(asset=asset_key)
@@ -56,12 +67,22 @@ class UserAssetsEndpoint(BaseAPIView):
parser_classes = (MultiPartParser, FormParser)
def get(self, request, asset_key):
files = FileAsset.objects.filter(asset=asset_key, created_by=request.user)
files = FileAsset.objects.filter(
asset=asset_key, created_by=request.user
)
if files.exists():
serializer = FileAssetSerializer(files, context={"request": request})
return Response({"data": serializer.data, "status": True}, status=status.HTTP_200_OK)
serializer = FileAssetSerializer(
files, context={"request": request}
)
return Response(
{"data": serializer.data, "status": True},
status=status.HTTP_200_OK,
)
else:
return Response({"error": "Asset key does not exist", "status": False}, status=status.HTTP_200_OK)
return Response(
{"error": "Asset key does not exist", "status": False},
status=status.HTTP_200_OK,
)
def post(self, request):
serializer = FileAssetSerializer(data=request.data)
@@ -70,9 +91,10 @@ class UserAssetsEndpoint(BaseAPIView):
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def delete(self, request, asset_key):
file_asset = FileAsset.objects.get(asset=asset_key, created_by=request.user)
file_asset = FileAsset.objects.get(
asset=asset_key, created_by=request.user
)
file_asset.is_deleted = True
file_asset.save()
return Response(status=status.HTTP_204_NO_CONTENT)

View File

@@ -128,7 +128,8 @@ class ForgotPasswordEndpoint(BaseAPIView):
status=status.HTTP_200_OK,
)
return Response(
{"error": "Please check the email"}, status=status.HTTP_400_BAD_REQUEST
{"error": "Please check the email"},
status=status.HTTP_400_BAD_REQUEST,
)
@@ -167,7 +168,9 @@ class ResetPasswordEndpoint(BaseAPIView):
}
return Response(data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
return Response(
serializer.errors, status=status.HTTP_400_BAD_REQUEST
)
except DjangoUnicodeDecodeError as identifier:
return Response(
@@ -191,7 +194,8 @@ class ChangePasswordEndpoint(BaseAPIView):
user.is_password_autoset = False
user.save()
return Response(
{"message": "Password updated successfully"}, status=status.HTTP_200_OK
{"message": "Password updated successfully"},
status=status.HTTP_200_OK,
)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
@@ -213,7 +217,8 @@ class SetUserPasswordEndpoint(BaseAPIView):
# Check password validation
if not password and len(str(password)) < 8:
return Response(
{"error": "Password is not valid"}, status=status.HTTP_400_BAD_REQUEST
{"error": "Password is not valid"},
status=status.HTTP_400_BAD_REQUEST,
)
# Set the user password
@@ -281,7 +286,9 @@ class MagicGenerateEndpoint(BaseAPIView):
if data["current_attempt"] > 2:
return Response(
{"error": "Max attempts exhausted. Please try again later."},
{
"error": "Max attempts exhausted. Please try again later."
},
status=status.HTTP_400_BAD_REQUEST,
)
@@ -339,7 +346,8 @@ class EmailCheckEndpoint(BaseAPIView):
if not email:
return Response(
{"error": "Email is required"}, status=status.HTTP_400_BAD_REQUEST
{"error": "Email is required"},
status=status.HTTP_400_BAD_REQUEST,
)
# validate the email
@@ -347,7 +355,8 @@ class EmailCheckEndpoint(BaseAPIView):
validate_email(email)
except ValidationError:
return Response(
{"error": "Email is not valid"}, status=status.HTTP_400_BAD_REQUEST
{"error": "Email is not valid"},
status=status.HTTP_400_BAD_REQUEST,
)
# Check if the user exists
@@ -399,13 +408,18 @@ class EmailCheckEndpoint(BaseAPIView):
key, token, current_attempt = generate_magic_token(email=email)
if not current_attempt:
return Response(
{"error": "Max attempts exhausted. Please try again later."},
{
"error": "Max attempts exhausted. Please try again later."
},
status=status.HTTP_400_BAD_REQUEST,
)
# Trigger the email
magic_link.delay(email, "magic_" + str(email), token, current_site)
return Response(
{"is_password_autoset": user.is_password_autoset, "is_existing": False},
{
"is_password_autoset": user.is_password_autoset,
"is_existing": False,
},
status=status.HTTP_200_OK,
)
@@ -433,7 +447,9 @@ class EmailCheckEndpoint(BaseAPIView):
key, token, current_attempt = generate_magic_token(email=email)
if not current_attempt:
return Response(
{"error": "Max attempts exhausted. Please try again later."},
{
"error": "Max attempts exhausted. Please try again later."
},
status=status.HTTP_400_BAD_REQUEST,
)

View File

@@ -73,7 +73,7 @@ class SignUpEndpoint(BaseAPIView):
# get configuration values
# Get configuration values
ENABLE_SIGNUP, = get_configuration_value(
(ENABLE_SIGNUP,) = get_configuration_value(
[
{
"key": "ENABLE_SIGNUP",
@@ -173,7 +173,7 @@ class SignInEndpoint(BaseAPIView):
# Create the user
else:
ENABLE_SIGNUP, = get_configuration_value(
(ENABLE_SIGNUP,) = get_configuration_value(
[
{
"key": "ENABLE_SIGNUP",
@@ -364,9 +364,11 @@ class MagicSignInEndpoint(BaseAPIView):
user.save()
# Check if user has any accepted invites for workspace and add them to workspace
workspace_member_invites = WorkspaceMemberInvite.objects.filter(
workspace_member_invites = (
WorkspaceMemberInvite.objects.filter(
email=user.email, accepted=True
)
)
WorkspaceMember.objects.bulk_create(
[
@@ -431,7 +433,9 @@ class MagicSignInEndpoint(BaseAPIView):
else:
return Response(
{"error": "Your login code was incorrect. Please try again."},
{
"error": "Your login code was incorrect. Please try again."
},
status=status.HTTP_400_BAD_REQUEST,
)
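The (ENABLE_SIGNUP,) change in both endpoints is purely cosmetic: both spellings unpack a one-element sequence, but the parenthesized form makes the trailing comma hard to miss. In miniature:

values = ["1"]
ENABLE_SIGNUP, = values    # old spelling: reads like a plain assignment
(ENABLE_SIGNUP,) = values  # new spelling: clearly tuple unpacking
assert ENABLE_SIGNUP == "1"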

View File

@@ -46,7 +46,9 @@ class WebhookMixin:
bulk = False
def finalize_response(self, request, response, *args, **kwargs):
response = super().finalize_response(request, response, *args, **kwargs)
response = super().finalize_response(
request, response, *args, **kwargs
)
# Check whether the webhook should be sent
if (
@@ -88,7 +90,9 @@ class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):
return self.model.objects.all()
except Exception as e:
capture_exception(e)
raise APIException("Please check the view", status.HTTP_400_BAD_REQUEST)
raise APIException(
"Please check the view", status.HTTP_400_BAD_REQUEST
)
def handle_exception(self, exc):
"""
@@ -99,6 +103,7 @@ class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):
response = super().handle_exception(exc)
return response
except Exception as e:
print(e) if settings.DEBUG else print("Server Error")
if isinstance(e, IntegrityError):
return Response(
{"error": "The payload is not valid"},
@@ -124,10 +129,11 @@ class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):
status=status.HTTP_400_BAD_REQUEST,
)
print(e) if settings.DEBUG else print("Server Error")
capture_exception(e)
return Response({"error": "Something went wrong please try again later"}, status=status.HTTP_500_INTERNAL_SERVER_ERROR)
return Response(
{"error": "Something went wrong please try again later"},
status=status.HTTP_500_INTERNAL_SERVER_ERROR,
)
def dispatch(self, request, *args, **kwargs):
try:
@@ -158,6 +164,24 @@ class BaseViewSet(TimezoneMixin, ModelViewSet, BasePaginator):
if resolve(self.request.path_info).url_name == "project":
return self.kwargs.get("pk", None)
@property
def fields(self):
fields = [
field
for field in self.request.GET.get("fields", "").split(",")
if field
]
return fields if fields else None
@property
def expand(self):
expand = [
expand
for expand in self.request.GET.get("expand", "").split(",")
if expand
]
return expand if expand else None
class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
permission_classes = [
@@ -206,13 +230,18 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
)
if isinstance(e, KeyError):
return Response({"error": f"The required key does not exist."}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": f"The required key does not exist."},
status=status.HTTP_400_BAD_REQUEST,
)
if settings.DEBUG:
print(e)
capture_exception(e)
return Response({"error": "Something went wrong please try again later"}, status=status.HTTP_500_INTERNAL_SERVER_ERROR)
return Response(
{"error": "Something went wrong please try again later"},
status=status.HTTP_500_INTERNAL_SERVER_ERROR,
)
def dispatch(self, request, *args, **kwargs):
try:
@@ -237,3 +266,21 @@ class BaseAPIView(TimezoneMixin, APIView, BasePaginator):
@property
def project_id(self):
return self.kwargs.get("project_id", None)
@property
def fields(self):
fields = [
field
for field in self.request.GET.get("fields", "").split(",")
if field
]
return fields if fields else None
@property
def expand(self):
expand = [
expand
for expand in self.request.GET.get("expand", "").split(",")
if expand
]
return expand if expand else None
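The new fields and expand properties on both base classes share one reduction: split a comma-separated query parameter, drop empty entries, and fall back to None when nothing remains. Isolated (a sketch over a raw string instead of request.GET):

def parse_csv_param(raw):
    items = [item for item in raw.split(",") if item]
    return items if items else None

assert parse_csv_param("id,name,") == ["id", "name"]
assert parse_csv_param("") is None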

View File

@@ -20,7 +20,6 @@ class ConfigurationEndpoint(BaseAPIView):
]
def get(self, request):
# Get all the configuration
(
GOOGLE_CLIENT_ID,
@@ -90,8 +89,16 @@ class ConfigurationEndpoint(BaseAPIView):
data = {}
# Authentication
data["google_client_id"] = GOOGLE_CLIENT_ID if GOOGLE_CLIENT_ID and GOOGLE_CLIENT_ID != "\"\"" else None
data["github_client_id"] = GITHUB_CLIENT_ID if GITHUB_CLIENT_ID and GITHUB_CLIENT_ID != "\"\"" else None
data["google_client_id"] = (
GOOGLE_CLIENT_ID
if GOOGLE_CLIENT_ID and GOOGLE_CLIENT_ID != '""'
else None
)
data["github_client_id"] = (
GITHUB_CLIENT_ID
if GITHUB_CLIENT_ID and GITHUB_CLIENT_ID != '""'
else None
)
data["github_app_name"] = GITHUB_APP_NAME
data["magic_login"] = (
bool(EMAIL_HOST_USER) and bool(EMAIL_HOST_PASSWORD)
@@ -112,9 +119,129 @@ class ConfigurationEndpoint(BaseAPIView):
data["has_openai_configured"] = bool(OPENAI_API_KEY)
# File size settings
data["file_size_limit"] = float(os.environ.get("FILE_SIZE_LIMIT", 5242880))
data["file_size_limit"] = float(
os.environ.get("FILE_SIZE_LIMIT", 5242880)
)
# is self managed
data["is_self_managed"] = bool(int(os.environ.get("IS_SELF_MANAGED", "1")))
# is smtp configured
data["is_smtp_configured"] = bool(EMAIL_HOST_USER) and bool(
EMAIL_HOST_PASSWORD
)
return Response(data, status=status.HTTP_200_OK)
class MobileConfigurationEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def get(self, request):
(
GOOGLE_CLIENT_ID,
GOOGLE_SERVER_CLIENT_ID,
GOOGLE_IOS_CLIENT_ID,
EMAIL_HOST_USER,
EMAIL_HOST_PASSWORD,
ENABLE_MAGIC_LINK_LOGIN,
ENABLE_EMAIL_PASSWORD,
POSTHOG_API_KEY,
POSTHOG_HOST,
UNSPLASH_ACCESS_KEY,
OPENAI_API_KEY,
) = get_configuration_value(
[
{
"key": "GOOGLE_CLIENT_ID",
"default": os.environ.get("GOOGLE_CLIENT_ID", None),
},
{
"key": "GOOGLE_SERVER_CLIENT_ID",
"default": os.environ.get("GOOGLE_SERVER_CLIENT_ID", None),
},
{
"key": "GOOGLE_IOS_CLIENT_ID",
"default": os.environ.get("GOOGLE_IOS_CLIENT_ID", None),
},
{
"key": "EMAIL_HOST_USER",
"default": os.environ.get("EMAIL_HOST_USER", None),
},
{
"key": "EMAIL_HOST_PASSWORD",
"default": os.environ.get("EMAIL_HOST_PASSWORD", None),
},
{
"key": "ENABLE_MAGIC_LINK_LOGIN",
"default": os.environ.get("ENABLE_MAGIC_LINK_LOGIN", "1"),
},
{
"key": "ENABLE_EMAIL_PASSWORD",
"default": os.environ.get("ENABLE_EMAIL_PASSWORD", "1"),
},
{
"key": "POSTHOG_API_KEY",
"default": os.environ.get("POSTHOG_API_KEY", "1"),
},
{
"key": "POSTHOG_HOST",
"default": os.environ.get("POSTHOG_HOST", "1"),
},
{
"key": "UNSPLASH_ACCESS_KEY",
"default": os.environ.get("UNSPLASH_ACCESS_KEY", "1"),
},
{
"key": "OPENAI_API_KEY",
"default": os.environ.get("OPENAI_API_KEY", "1"),
},
]
)
data = {}
# Authentication
data["google_client_id"] = (
GOOGLE_CLIENT_ID
if GOOGLE_CLIENT_ID and GOOGLE_CLIENT_ID != '""'
else None
)
data["google_server_client_id"] = (
GOOGLE_SERVER_CLIENT_ID
if GOOGLE_SERVER_CLIENT_ID and GOOGLE_SERVER_CLIENT_ID != '""'
else None
)
data["google_ios_client_id"] = (
(GOOGLE_IOS_CLIENT_ID)[::-1]
if GOOGLE_IOS_CLIENT_ID is not None
else None
)
# Posthog
data["posthog_api_key"] = POSTHOG_API_KEY
data["posthog_host"] = POSTHOG_HOST
data["magic_login"] = (
bool(EMAIL_HOST_USER) and bool(EMAIL_HOST_PASSWORD)
) and ENABLE_MAGIC_LINK_LOGIN == "1"
data["email_password_login"] = ENABLE_EMAIL_PASSWORD == "1"
# Posthog
data["posthog_api_key"] = POSTHOG_API_KEY
data["posthog_host"] = POSTHOG_HOST
# Unsplash
data["has_unsplash_configured"] = bool(UNSPLASH_ACCESS_KEY)
# Open AI settings
data["has_openai_configured"] = bool(OPENAI_API_KEY)
# File size settings
data["file_size_limit"] = float(
os.environ.get("FILE_SIZE_LIMIT", 5242880)
)
# is smtp configured
data["is_smtp_configured"] = not (
bool(EMAIL_HOST_USER) and bool(EMAIL_HOST_PASSWORD)
)
return Response(data, status=status.HTTP_200_OK)
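The repeated != '""' comparisons in both configuration endpoints treat a literal pair of double-quote characters, a common artifact of quoting empty values in .env files, as unset. The guard in isolation (a sketch):

def clean_env_value(value):
    # A value of '""' (two quote characters) counts as missing
    return value if value and value != '""' else None

assert clean_env_value('""') is None
assert clean_env_value("client-id-123") == "client-id-123"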

View File

@@ -14,7 +14,7 @@ from django.db.models import (
Case,
When,
Value,
CharField
CharField,
)
from django.core import serializers
from django.utils import timezone
@@ -31,10 +31,16 @@ from plane.app.serializers import (
CycleSerializer,
CycleIssueSerializer,
CycleFavoriteSerializer,
IssueSerializer,
IssueStateSerializer,
CycleWriteSerializer,
CycleUserPropertiesSerializer,
)
from plane.app.permissions import ProjectEntityPermission
from plane.app.permissions import (
ProjectEntityPermission,
ProjectLitePermission,
WorkspaceUserPermission
)
from plane.db.models import (
User,
Cycle,
@@ -44,9 +50,10 @@ from plane.db.models import (
IssueLink,
IssueAttachment,
Label,
CycleUserProperties,
IssueSubscriber,
)
from plane.bgtasks.issue_activites_task import issue_activity
from plane.utils.grouper import group_results
from plane.utils.issue_filters import issue_filters
from plane.utils.analytics_plot import burndown_plot
@@ -61,7 +68,8 @@ class CycleViewSet(WebhookMixin, BaseViewSet):
def perform_create(self, serializer):
serializer.save(
project_id=self.kwargs.get("project_id"), owned_by=self.request.user
project_id=self.kwargs.get("project_id"),
owned_by=self.request.user,
)
def get_queryset(self):
@@ -140,7 +148,9 @@ class CycleViewSet(WebhookMixin, BaseViewSet):
),
)
)
.annotate(total_estimates=Sum("issue_cycle__issue__estimate_point"))
.annotate(
total_estimates=Sum("issue_cycle__issue__estimate_point")
)
.annotate(
completed_estimates=Sum(
"issue_cycle__issue__estimate_point",
@@ -164,20 +174,17 @@ class CycleViewSet(WebhookMixin, BaseViewSet):
.annotate(
status=Case(
When(
Q(start_date__lte=timezone.now()) & Q(end_date__gte=timezone.now()),
then=Value("CURRENT")
Q(start_date__lte=timezone.now())
& Q(end_date__gte=timezone.now()),
then=Value("CURRENT"),
),
When(
start_date__gt=timezone.now(),
then=Value("UPCOMING")
),
When(
end_date__lt=timezone.now(),
then=Value("COMPLETED")
start_date__gt=timezone.now(), then=Value("UPCOMING")
),
When(end_date__lt=timezone.now(), then=Value("COMPLETED")),
When(
Q(start_date__isnull=True) & Q(end_date__isnull=True),
then=Value("DRAFT")
then=Value("DRAFT"),
),
default=Value("DRAFT"),
output_field=CharField(),
@@ -186,13 +193,17 @@ class CycleViewSet(WebhookMixin, BaseViewSet):
.prefetch_related(
Prefetch(
"issue_cycle__issue__assignees",
queryset=User.objects.only("avatar", "first_name", "id").distinct(),
queryset=User.objects.only(
"avatar", "first_name", "id"
).distinct(),
)
)
.prefetch_related(
Prefetch(
"issue_cycle__issue__labels",
queryset=Label.objects.only("name", "color", "id").distinct(),
queryset=Label.objects.only(
"name", "color", "id"
).distinct(),
)
)
.order_by("-is_favorite", "name")
@@ -202,6 +213,11 @@ class CycleViewSet(WebhookMixin, BaseViewSet):
def list(self, request, slug, project_id):
queryset = self.get_queryset()
cycle_view = request.GET.get("cycle_view", "all")
fields = [
field
for field in request.GET.get("fields", "").split(",")
if field
]
queryset = queryset.order_by("-is_favorite", "-created_at")
@@ -298,7 +314,9 @@ class CycleViewSet(WebhookMixin, BaseViewSet):
"completion_chart": {},
}
if data[0]["start_date"] and data[0]["end_date"]:
data[0]["distribution"]["completion_chart"] = burndown_plot(
data[0]["distribution"][
"completion_chart"
] = burndown_plot(
queryset=queryset.first(),
slug=slug,
project_id=project_id,
@@ -307,44 +325,8 @@ class CycleViewSet(WebhookMixin, BaseViewSet):
return Response(data, status=status.HTTP_200_OK)
# Upcoming Cycles
if cycle_view == "upcoming":
queryset = queryset.filter(start_date__gt=timezone.now())
return Response(
CycleSerializer(queryset, many=True).data, status=status.HTTP_200_OK
)
# Completed Cycles
if cycle_view == "completed":
queryset = queryset.filter(end_date__lt=timezone.now())
return Response(
CycleSerializer(queryset, many=True).data, status=status.HTTP_200_OK
)
# Draft Cycles
if cycle_view == "draft":
queryset = queryset.filter(
end_date=None,
start_date=None,
)
return Response(
CycleSerializer(queryset, many=True).data, status=status.HTTP_200_OK
)
# Incomplete Cycles
if cycle_view == "incomplete":
queryset = queryset.filter(
Q(end_date__gte=timezone.now().date()) | Q(end_date__isnull=True),
)
return Response(
CycleSerializer(queryset, many=True).data, status=status.HTTP_200_OK
)
# If no matching view is found return all cycles
return Response(
CycleSerializer(queryset, many=True).data, status=status.HTTP_200_OK
)
cycles = CycleSerializer(queryset, many=True).data
return Response(cycles, status=status.HTTP_200_OK)
def create(self, request, slug, project_id):
if (
@@ -360,8 +342,18 @@ class CycleViewSet(WebhookMixin, BaseViewSet):
project_id=project_id,
owned_by=request.user,
)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
cycle = (
self.get_queryset()
.filter(pk=serializer.data["id"])
.first()
)
serializer = CycleSerializer(cycle)
return Response(
serializer.data, status=status.HTTP_201_CREATED
)
return Response(
serializer.errors, status=status.HTTP_400_BAD_REQUEST
)
else:
return Response(
{
@@ -371,15 +363,22 @@ class CycleViewSet(WebhookMixin, BaseViewSet):
)
def partial_update(self, request, slug, project_id, pk):
cycle = Cycle.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
cycle = Cycle.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
request_data = request.data
if cycle.end_date is not None and cycle.end_date < timezone.now().date():
if (
cycle.end_date is not None
and cycle.end_date < timezone.now().date()
):
if "sort_order" in request_data:
# Can only change sort order
request_data = {
"sort_order": request_data.get("sort_order", cycle.sort_order)
"sort_order": request_data.get(
"sort_order", cycle.sort_order
)
}
else:
return Response(
@@ -389,7 +388,9 @@ class CycleViewSet(WebhookMixin, BaseViewSet):
status=status.HTTP_400_BAD_REQUEST,
)
serializer = CycleWriteSerializer(cycle, data=request.data, partial=True)
serializer = CycleWriteSerializer(
cycle, data=request.data, partial=True
)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK)
@@ -410,7 +411,13 @@ class CycleViewSet(WebhookMixin, BaseViewSet):
.annotate(assignee_id=F("assignees__id"))
.annotate(avatar=F("assignees__avatar"))
.annotate(display_name=F("assignees__display_name"))
.values("first_name", "last_name", "assignee_id", "avatar", "display_name")
.values(
"first_name",
"last_name",
"assignee_id",
"avatar",
"display_name",
)
.annotate(
total_issues=Count(
"assignee_id",
@@ -489,7 +496,10 @@ class CycleViewSet(WebhookMixin, BaseViewSet):
if queryset.start_date and queryset.end_date:
data["distribution"]["completion_chart"] = burndown_plot(
queryset=queryset, slug=slug, project_id=project_id, cycle_id=pk
queryset=queryset,
slug=slug,
project_id=project_id,
cycle_id=pk,
)
return Response(
@@ -499,11 +509,13 @@ class CycleViewSet(WebhookMixin, BaseViewSet):
def destroy(self, request, slug, project_id, pk):
cycle_issues = list(
CycleIssue.objects.filter(cycle_id=self.kwargs.get("pk")).values_list(
"issue", flat=True
CycleIssue.objects.filter(
cycle_id=self.kwargs.get("pk")
).values_list("issue", flat=True)
)
cycle = Cycle.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
cycle = Cycle.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
issue_activity.delay(
type="cycle.activity.deleted",
@@ -546,7 +558,9 @@ class CycleIssueViewSet(WebhookMixin, BaseViewSet):
super()
.get_queryset()
.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("issue_id"))
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("issue_id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -565,18 +579,23 @@ class CycleIssueViewSet(WebhookMixin, BaseViewSet):
@method_decorator(gzip_page)
def list(self, request, slug, project_id, cycle_id):
fields = [field for field in request.GET.get("fields", "").split(",") if field]
fields = [
field
for field in request.GET.get("fields", "").split(",")
if field
]
order_by = request.GET.get("order_by", "created_at")
filters = issue_filters(request.query_params, "GET")
issues = (
Issue.issue_objects.filter(issue_cycle__cycle_id=cycle_id)
.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(bridge_id=F("issue_cycle__id"))
.filter(project_id=project_id)
.filter(workspace__slug=slug)
.select_related("project")
@@ -587,6 +606,8 @@ class CycleIssueViewSet(WebhookMixin, BaseViewSet):
.prefetch_related("labels")
.order_by(order_by)
.filter(**filters)
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(module_id=F("issue_module__module_id"))
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@@ -594,32 +615,43 @@ class CycleIssueViewSet(WebhookMixin, BaseViewSet):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
issues = IssueStateSerializer(
issues, many=True, fields=fields if fields else None
).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(issue_dict, status=status.HTTP_200_OK)
.annotate(
is_subscribed=Exists(
IssueSubscriber.objects.filter(
subscriber=self.request.user, issue_id=OuterRef("id")
)
)
)
)
serializer = IssueSerializer(
issues, many=True, fields=fields if fields else None
)
return Response(serializer.data, status=status.HTTP_200_OK)
def create(self, request, slug, project_id, cycle_id):
issues = request.data.get("issues", [])
if not len(issues):
return Response(
{"error": "Issues are required"}, status=status.HTTP_400_BAD_REQUEST
{"error": "Issues are required"},
status=status.HTTP_400_BAD_REQUEST,
)
cycle = Cycle.objects.get(
workspace__slug=slug, project_id=project_id, pk=cycle_id
)
if cycle.end_date is not None and cycle.end_date < timezone.now().date():
if (
cycle.end_date is not None
and cycle.end_date < timezone.now().date()
):
return Response(
{
"error": "The Cycle has already been completed so no new issues can be added"
@@ -693,16 +725,22 @@ class CycleIssueViewSet(WebhookMixin, BaseViewSet):
)
# Return all Cycle Issues
issues = self.get_queryset().values_list("issue_id", flat=True)
return Response(
CycleIssueSerializer(self.get_queryset(), many=True).data,
IssueSerializer(
Issue.objects.filter(pk__in=issues), many=True
).data,
status=status.HTTP_200_OK,
)
def destroy(self, request, slug, project_id, cycle_id, pk):
def destroy(self, request, slug, project_id, cycle_id, issue_id):
cycle_issue = CycleIssue.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id, cycle_id=cycle_id
issue_id=issue_id,
workspace__slug=slug,
project_id=project_id,
cycle_id=cycle_id,
)
issue_id = cycle_issue.issue_id
issue_activity.delay(
type="cycle.activity.deleted",
requested_data=json.dumps(
@@ -712,7 +750,7 @@ class CycleIssueViewSet(WebhookMixin, BaseViewSet):
}
),
actor_id=str(self.request.user.id),
issue_id=str(cycle_issue.issue_id),
issue_id=str(issue_id),
project_id=str(self.kwargs.get("project_id", None)),
current_instance=None,
epoch=int(timezone.now().timestamp()),
@@ -834,3 +872,273 @@ class TransferCycleIssueEndpoint(BaseAPIView):
)
return Response({"message": "Success"}, status=status.HTTP_200_OK)
class CycleUserPropertiesEndpoint(BaseAPIView):
permission_classes = [
ProjectLitePermission,
]
def patch(self, request, slug, project_id, cycle_id):
cycle_properties = CycleUserProperties.objects.get(
user=request.user,
cycle_id=cycle_id,
project_id=project_id,
workspace__slug=slug,
)
cycle_properties.filters = request.data.get(
"filters", cycle_properties.filters
)
cycle_properties.display_filters = request.data.get(
"display_filters", cycle_properties.display_filters
)
cycle_properties.display_properties = request.data.get(
"display_properties", cycle_properties.display_properties
)
cycle_properties.save()
serializer = CycleUserPropertiesSerializer(cycle_properties)
return Response(serializer.data, status=status.HTTP_201_CREATED)
def get(self, request, slug, project_id, cycle_id):
cycle_properties, _ = CycleUserProperties.objects.get_or_create(
user=request.user,
project_id=project_id,
cycle_id=cycle_id,
workspace__slug=slug,
)
serializer = CycleUserPropertiesSerializer(cycle_properties)
return Response(serializer.data, status=status.HTTP_200_OK)
class ActiveCycleEndpoint(BaseAPIView):
permission_classes = [
WorkspaceUserPermission,
]
def get(self, request, slug):
subquery = CycleFavorite.objects.filter(
user=self.request.user,
cycle_id=OuterRef("pk"),
project_id=self.kwargs.get("project_id"),
workspace__slug=self.kwargs.get("slug"),
)
active_cycles = (
Cycle.objects.filter(
workspace__slug=slug,
project__project_projectmember__member=self.request.user,
start_date__lte=timezone.now(),
end_date__gte=timezone.now(),
)
.select_related("project")
.select_related("workspace")
.select_related("owned_by")
.annotate(is_favorite=Exists(subquery))
.annotate(
total_issues=Count(
"issue_cycle",
filter=Q(
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
),
)
)
.annotate(
completed_issues=Count(
"issue_cycle__issue__state__group",
filter=Q(
issue_cycle__issue__state__group="completed",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
),
)
)
.annotate(
cancelled_issues=Count(
"issue_cycle__issue__state__group",
filter=Q(
issue_cycle__issue__state__group="cancelled",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
),
)
)
.annotate(
started_issues=Count(
"issue_cycle__issue__state__group",
filter=Q(
issue_cycle__issue__state__group="started",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
),
)
)
.annotate(
unstarted_issues=Count(
"issue_cycle__issue__state__group",
filter=Q(
issue_cycle__issue__state__group="unstarted",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
),
)
)
.annotate(
backlog_issues=Count(
"issue_cycle__issue__state__group",
filter=Q(
issue_cycle__issue__state__group="backlog",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
),
)
)
.annotate(total_estimates=Sum("issue_cycle__issue__estimate_point"))
.annotate(
completed_estimates=Sum(
"issue_cycle__issue__estimate_point",
filter=Q(
issue_cycle__issue__state__group="completed",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
),
)
)
.annotate(
started_estimates=Sum(
"issue_cycle__issue__estimate_point",
filter=Q(
issue_cycle__issue__state__group="started",
issue_cycle__issue__archived_at__isnull=True,
issue_cycle__issue__is_draft=False,
),
)
)
.annotate(
status=Case(
When(
Q(start_date__lte=timezone.now())
& Q(end_date__gte=timezone.now()),
then=Value("CURRENT"),
),
When(start_date__gt=timezone.now(), then=Value("UPCOMING")),
When(end_date__lt=timezone.now(), then=Value("COMPLETED")),
When(
Q(start_date__isnull=True) & Q(end_date__isnull=True),
then=Value("DRAFT"),
),
default=Value("DRAFT"),
output_field=CharField(),
)
)
.prefetch_related(
Prefetch(
"issue_cycle__issue__assignees",
queryset=User.objects.only("avatar", "first_name", "id").distinct(),
)
)
.prefetch_related(
Prefetch(
"issue_cycle__issue__labels",
queryset=Label.objects.only("name", "color", "id").distinct(),
)
)
.order_by("-created_at")
)
cycles = CycleSerializer(active_cycles, many=True).data
for cycle in cycles:
assignee_distribution = (
Issue.objects.filter(
issue_cycle__cycle_id=cycle["id"],
project_id=cycle["project"],
workspace__slug=slug,
)
.annotate(display_name=F("assignees__display_name"))
.annotate(assignee_id=F("assignees__id"))
.annotate(avatar=F("assignees__avatar"))
.values("display_name", "assignee_id", "avatar")
.annotate(
total_issues=Count(
"assignee_id",
filter=Q(archived_at__isnull=True, is_draft=False),
),
)
.annotate(
completed_issues=Count(
"assignee_id",
filter=Q(
completed_at__isnull=False,
archived_at__isnull=True,
is_draft=False,
),
)
)
.annotate(
pending_issues=Count(
"assignee_id",
filter=Q(
completed_at__isnull=True,
archived_at__isnull=True,
is_draft=False,
),
)
)
.order_by("display_name")
)
label_distribution = (
Issue.objects.filter(
issue_cycle__cycle_id=cycle["id"],
project_id=cycle["project"],
workspace__slug=slug,
)
.annotate(label_name=F("labels__name"))
.annotate(color=F("labels__color"))
.annotate(label_id=F("labels__id"))
.values("label_name", "color", "label_id")
.annotate(
total_issues=Count(
"label_id",
filter=Q(archived_at__isnull=True, is_draft=False),
)
)
.annotate(
completed_issues=Count(
"label_id",
filter=Q(
completed_at__isnull=False,
archived_at__isnull=True,
is_draft=False,
),
)
)
.annotate(
pending_issues=Count(
"label_id",
filter=Q(
completed_at__isnull=True,
archived_at__isnull=True,
is_draft=False,
),
)
)
.order_by("label_name")
)
cycle["distribution"] = {
"assignees": assignee_distribution,
"labels": label_distribution,
"completion_chart": {},
}
if cycle["start_date"] and cycle["end_date"]:
cycle["distribution"][
"completion_chart"
] = burndown_plot(
queryset=active_cycles.get(pk=cycle["id"]),
slug=slug,
project_id=cycle["project"],
cycle_id=cycle["id"],
)
return Response(cycles, status=status.HTTP_200_OK)
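Every per-group tally on ActiveCycleEndpoint uses the same conditional-aggregation shape: Count over the state group, restricted by a Q filter. A generic sketch of that one pattern (the relation names match the queryset above):

from django.db.models import Count, Q

def annotate_state_count(queryset, group):
    # Count only non-archived, non-draft issues in the given state group
    return queryset.annotate(**{
        f"{group}_issues": Count(
            "issue_cycle__issue__state__group",
            filter=Q(
                issue_cycle__issue__state__group=group,
                issue_cycle__issue__archived_at__isnull=True,
                issue_cycle__issue__is_draft=False,
            ),
        )
    })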

View File

@@ -0,0 +1,658 @@
# Django imports
from django.db.models import (
Q,
Case,
When,
Value,
CharField,
Count,
F,
Exists,
OuterRef,
Max,
Subquery,
JSONField,
Func,
Prefetch,
)
from django.utils import timezone
# Third Party imports
from rest_framework.response import Response
from rest_framework import status
# Module imports
from . import BaseAPIView
from plane.db.models import (
Issue,
IssueActivity,
ProjectMember,
Widget,
DashboardWidget,
Dashboard,
Project,
IssueLink,
IssueAttachment,
IssueRelation,
)
from plane.app.serializers import (
IssueActivitySerializer,
IssueSerializer,
DashboardSerializer,
WidgetSerializer,
)
from plane.utils.issue_filters import issue_filters
def dashboard_overview_stats(self, request, slug):
assigned_issues = Issue.issue_objects.filter(
project__project_projectmember__is_active=True,
project__project_projectmember__member=request.user,
workspace__slug=slug,
assignees__in=[request.user],
).count()
pending_issues_count = Issue.issue_objects.filter(
~Q(state__group__in=["completed", "cancelled"]),
project__project_projectmember__is_active=True,
project__project_projectmember__member=request.user,
workspace__slug=slug,
assignees__in=[request.user],
).count()
created_issues_count = Issue.issue_objects.filter(
workspace__slug=slug,
project__project_projectmember__is_active=True,
project__project_projectmember__member=request.user,
created_by_id=request.user.id,
).count()
completed_issues_count = Issue.issue_objects.filter(
workspace__slug=slug,
project__project_projectmember__is_active=True,
project__project_projectmember__member=request.user,
assignees__in=[request.user],
state__group="completed",
).count()
return Response(
{
"assigned_issues_count": assigned_issues,
"pending_issues_count": pending_issues_count,
"completed_issues_count": completed_issues_count,
"created_issues_count": created_issues_count,
},
status=status.HTTP_200_OK,
)
def dashboard_assigned_issues(self, request, slug):
filters = issue_filters(request.query_params, "GET")
issue_type = request.GET.get("issue_type", None)
# get all the assigned issues
assigned_issues = (
Issue.issue_objects.filter(
workspace__slug=slug,
project__project_projectmember__member=request.user,
project__project_projectmember__is_active=True,
assignees__in=[request.user],
)
.filter(**filters)
.select_related("workspace", "project", "state", "parent")
.prefetch_related("assignees", "labels")
.prefetch_related(
Prefetch(
"issue_relation",
queryset=IssueRelation.objects.select_related(
"related_issue"
).select_related("issue"),
)
)
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(module_id=F("issue_module__module_id"))
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.order_by("created_at")
)
# Priority Ordering
priority_order = ["urgent", "high", "medium", "low", "none"]
assigned_issues = assigned_issues.annotate(
priority_order=Case(
*[
When(priority=p, then=Value(i))
for i, p in enumerate(priority_order)
],
output_field=CharField(),
)
).order_by("priority_order")
if issue_type == "completed":
completed_issues_count = assigned_issues.filter(
state__group__in=["completed"]
).count()
completed_issues = assigned_issues.filter(
state__group__in=["completed"]
)[:5]
return Response(
{
"issues": IssueSerializer(
completed_issues, many=True, expand=self.expand
).data,
"count": completed_issues_count,
},
status=status.HTTP_200_OK,
)
if issue_type == "overdue":
overdue_issues_count = assigned_issues.filter(
state__group__in=["backlog", "unstarted", "started"],
target_date__lt=timezone.now()
).count()
overdue_issues = assigned_issues.filter(
state__group__in=["backlog", "unstarted", "started"],
target_date__lt=timezone.now()
)[:5]
return Response(
{
"issues": IssueSerializer(
overdue_issues, many=True, expand=self.expand
).data,
"count": overdue_issues_count,
},
status=status.HTTP_200_OK,
)
if issue_type == "upcoming":
upcoming_issues_count = assigned_issues.filter(
state__group__in=["backlog", "unstarted", "started"],
target_date__gte=timezone.now()
).count()
upcoming_issues = assigned_issues.filter(
state__group__in=["backlog", "unstarted", "started"],
target_date__gte=timezone.now()
)[:5]
return Response(
{
"issues": IssueSerializer(
upcoming_issues, many=True, expand=self.expand
).data,
"count": upcoming_issues_count,
},
status=status.HTTP_200_OK,
)
return Response(
{"error": "Please specify a valid issue type"},
status=status.HTTP_400_BAD_REQUEST,
)
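Both dashboard list views build a synthetic priority_order column so results follow the business ordering rather than alphabetical priority names. The pattern on its own (a sketch; queryset is any Issue queryset):

from django.db.models import Case, CharField, Value, When

def order_by_priority(queryset):
    priority_order = ["urgent", "high", "medium", "low", "none"]
    return queryset.annotate(
        priority_order=Case(
            *[
                When(priority=p, then=Value(i))
                for i, p in enumerate(priority_order)
            ],
            output_field=CharField(),
        )
    ).order_by("priority_order")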
def dashboard_created_issues(self, request, slug):
filters = issue_filters(request.query_params, "GET")
issue_type = request.GET.get("issue_type", None)
# get all the created issues
created_issues = (
Issue.issue_objects.filter(
workspace__slug=slug,
project__project_projectmember__member=request.user,
project__project_projectmember__is_active=True,
created_by=request.user,
)
.filter(**filters)
.select_related("workspace", "project", "state", "parent")
.prefetch_related("assignees", "labels")
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(module_id=F("issue_module__module_id"))
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.order_by("created_at")
)
# Priority Ordering
priority_order = ["urgent", "high", "medium", "low", "none"]
created_issues = created_issues.annotate(
priority_order=Case(
*[
When(priority=p, then=Value(i))
for i, p in enumerate(priority_order)
],
output_field=CharField(),
)
).order_by("priority_order")
if issue_type == "completed":
completed_issues_count = created_issues.filter(
state__group__in=["completed"]
).count()
completed_issues = created_issues.filter(
state__group__in=["completed"]
)[:5]
return Response(
{
"issues": IssueSerializer(completed_issues, many=True).data,
"count": completed_issues_count,
},
status=status.HTTP_200_OK,
)
if issue_type == "overdue":
overdue_issues_count = created_issues.filter(
state__group__in=["backlog", "unstarted", "started"],
target_date__lt=timezone.now()
).count()
overdue_issues = created_issues.filter(
state__group__in=["backlog", "unstarted", "started"],
target_date__lt=timezone.now()
)[:5]
return Response(
{
"issues": IssueSerializer(overdue_issues, many=True).data,
"count": overdue_issues_count,
},
status=status.HTTP_200_OK,
)
if issue_type == "upcoming":
upcoming_issues_count = created_issues.filter(
state__group__in=["backlog", "unstarted", "started"],
target_date__gte=timezone.now()
).count()
upcoming_issues = created_issues.filter(
state__group__in=["backlog", "unstarted", "started"],
target_date__gte=timezone.now()
)[:5]
return Response(
{
"issues": IssueSerializer(upcoming_issues, many=True).data,
"count": upcoming_issues_count,
},
status=status.HTTP_200_OK,
)
return Response(
{"error": "Please specify a valid issue type"},
status=status.HTTP_400_BAD_REQUEST,
)
def dashboard_issues_by_state_groups(self, request, slug):
filters = issue_filters(request.query_params, "GET")
state_order = ["backlog", "unstarted", "started", "completed", "cancelled"]
issues_by_state_groups = (
Issue.issue_objects.filter(
workspace__slug=slug,
project__project_projectmember__is_active=True,
project__project_projectmember__member=request.user,
assignees__in=[request.user],
)
.filter(**filters)
.values("state__group")
.annotate(count=Count("id"))
)
# default state
all_groups = {state: 0 for state in state_order}
# Update counts for existing groups
for entry in issues_by_state_groups:
all_groups[entry["state__group"]] = entry["count"]
# Prepare output including all groups with their counts
output_data = [
{"state": group, "count": count} for group, count in all_groups.items()
]
return Response(output_data, status=status.HTTP_200_OK)
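The zero-filling step guarantees every state group appears in the payload even when the aggregate returns no row for it. The same step with plain data (a sketch):

state_order = ["backlog", "unstarted", "started", "completed", "cancelled"]
rows = [{"state__group": "started", "count": 3}]  # example aggregate output

all_groups = {state: 0 for state in state_order}
for entry in rows:
    all_groups[entry["state__group"]] = entry["count"]

output = [{"state": g, "count": c} for g, c in all_groups.items()]
# Every group is present; missing ones report 0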
def dashboard_issues_by_priority(self, request, slug):
filters = issue_filters(request.query_params, "GET")
priority_order = ["urgent", "high", "medium", "low", "none"]
issues_by_priority = (
Issue.issue_objects.filter(
workspace__slug=slug,
project__project_projectmember__is_active=True,
project__project_projectmember__member=request.user,
assignees__in=[request.user],
)
.filter(**filters)
.values("priority")
.annotate(count=Count("id"))
)
# default priority
all_groups = {priority: 0 for priority in priority_order}
# Update counts for existing groups
for entry in issues_by_priority:
all_groups[entry["priority"]] = entry["count"]
# Prepare output including all groups with their counts
output_data = [
{"priority": group, "count": count}
for group, count in all_groups.items()
]
return Response(output_data, status=status.HTTP_200_OK)
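Both group-by widgets above zero-fill missing buckets so the client always receives every group, even ones with no issues. The idiom reduced to plain Python, with a stand-in result row:

# `rows` stands in for the .values("priority").annotate(count=Count("id")) queryset.
rows = [{"priority": "high", "count": 3}]
priority_order = ["urgent", "high", "medium", "low", "none"]
all_groups = {priority: 0 for priority in priority_order}
for entry in rows:
    all_groups[entry["priority"]] = entry["count"]
output_data = [{"priority": p, "count": c} for p, c in all_groups.items()]
# urgent/medium/low/none report 0 instead of being absent from the response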
def dashboard_recent_activity(self, request, slug):
queryset = IssueActivity.objects.filter(
~Q(field__in=["comment", "vote", "reaction", "draft"]),
workspace__slug=slug,
project__project_projectmember__member=request.user,
project__project_projectmember__is_active=True,
actor=request.user,
).select_related("actor", "workspace", "issue", "project")[:8]
return Response(
IssueActivitySerializer(queryset, many=True).data,
status=status.HTTP_200_OK,
)
def dashboard_recent_projects(self, request, slug):
project_ids = (
IssueActivity.objects.filter(
workspace__slug=slug,
project__project_projectmember__member=request.user,
project__project_projectmember__is_active=True,
actor=request.user,
)
.values_list("project_id", flat=True)
.distinct()
)
# Extract project IDs from the recent projects
unique_project_ids = set(project_ids)
# Fetch additional projects only if needed
if len(unique_project_ids) < 4:
additional_projects = Project.objects.filter(
project_projectmember__member=request.user,
project_projectmember__is_active=True,
workspace__slug=slug,
).exclude(id__in=unique_project_ids)
# Append additional project IDs to the existing list
unique_project_ids.update(additional_projects.values_list("id", flat=True))
return Response(
list(unique_project_ids)[:4],
status=status.HTTP_200_OK,
)
def dashboard_recent_collaborators(self, request, slug):
# Fetch all project IDs where the user belongs to
user_projects = Project.objects.filter(
project_projectmember__member=request.user,
project_projectmember__is_active=True,
workspace__slug=slug,
).values_list("id", flat=True)
# Fetch all users who have performed an activity in the projects where the user exists
users_with_activities = (
IssueActivity.objects.filter(
workspace__slug=slug,
project_id__in=user_projects,
)
.values("actor")
.exclude(actor=request.user)
.annotate(num_activities=Count("actor"))
.order_by("-num_activities")
)[:7]
# Get the count of active issues for each user in users_with_activities
users_with_active_issues = []
for user_activity in users_with_activities:
user_id = user_activity["actor"]
active_issue_count = Issue.objects.filter(
assignees__in=[user_id],
state__group__in=["unstarted", "started"],
).count()
users_with_active_issues.append(
{"user_id": user_id, "active_issue_count": active_issue_count}
)
# Insert the logged-in user's ID and their active issue count at the beginning
active_issue_count = Issue.objects.filter(
assignees__in=[request.user],
state__group__in=["unstarted", "started"],
).count()
if users_with_activities.count() < 7:
# Calculate the additional collaborators needed
additional_collaborators_needed = 7 - users_with_activities.count()
# Fetch additional collaborators from the project_member table
additional_collaborators = list(
set(
ProjectMember.objects.filter(
~Q(member=request.user),
project_id__in=user_projects,
workspace__slug=slug,
)
.exclude(
member__in=[
user["actor"] for user in users_with_activities
]
)
.values_list("member", flat=True)
)
)
additional_collaborators = additional_collaborators[
:additional_collaborators_needed
]
# Append additional collaborators to the list
for collaborator_id in additional_collaborators:
active_issue_count = Issue.objects.filter(
assignees__in=[collaborator_id],
state__group__in=["unstarted", "started"],
).count()
users_with_active_issues.append(
{
"user_id": str(collaborator_id),
"active_issue_count": active_issue_count,
}
)
users_with_active_issues.insert(
0,
{"user_id": request.user.id, "active_issue_count": active_issue_count},
)
return Response(users_with_active_issues, status=status.HTTP_200_OK)
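dashboard_recent_collaborators tops the activity-ranked list up to seven entries by pulling extra members from the project, skipping anyone already present. The top-up logic reduces to this sketch (names are illustrative):

def top_up(ranked, candidates, n):
    # keep the ranked order, then fill with unseen candidates until n entries
    extra = [c for c in candidates if c not in ranked]
    return (list(ranked) + extra)[:n]

top_up(["u1", "u2"], ["u2", "u3", "u4", "u5"], 4)  # -> ["u1", "u2", "u3", "u4"]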
class DashboardEndpoint(BaseAPIView):
def create(self, request, slug):
serializer = DashboardSerializer(data=request.data)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def patch(self, request, slug, pk):
# bind the existing instance so PATCH updates it rather than creating a new row
dashboard = Dashboard.objects.get(pk=pk)
serializer = DashboardSerializer(dashboard, data=request.data, partial=True)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def delete(self, request, slug, pk):
# delete the instance directly; no serializer round-trip is needed here
dashboard = Dashboard.objects.get(pk=pk)
dashboard.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
def get(self, request, slug, dashboard_id=None):
if not dashboard_id:
dashboard_type = request.GET.get("dashboard_type", None)
if dashboard_type == "home":
dashboard, created = Dashboard.objects.get_or_create(
type_identifier=dashboard_type, owned_by=request.user, is_default=True
)
if created:
widgets_to_fetch = [
"overview_stats",
"assigned_issues",
"created_issues",
"issues_by_state_groups",
"issues_by_priority",
"recent_activity",
"recent_projects",
"recent_collaborators",
]
updated_dashboard_widgets = []
for widget_key in widgets_to_fetch:
widget = Widget.objects.filter(key=widget_key).values_list("id", flat=True).first()
if widget:
updated_dashboard_widgets.append(
DashboardWidget(
widget_id=widget,
dashboard_id=dashboard.id,
)
)
DashboardWidget.objects.bulk_create(
updated_dashboard_widgets, batch_size=100
)
widgets = (
Widget.objects.annotate(
is_visible=Exists(
DashboardWidget.objects.filter(
widget_id=OuterRef("pk"),
dashboard_id=dashboard.id,
is_visible=True,
)
)
)
.annotate(
dashboard_filters=Subquery(
DashboardWidget.objects.filter(
widget_id=OuterRef("pk"),
dashboard_id=dashboard.id,
filters__isnull=False,
)
.exclude(filters={})
.values("filters")[:1]
)
)
.annotate(
widget_filters=Case(
When(
dashboard_filters__isnull=False,
then=F("dashboard_filters"),
),
default=F("filters"),
output_field=JSONField(),
)
)
)
return Response(
{
"dashboard": DashboardSerializer(dashboard).data,
"widgets": WidgetSerializer(widgets, many=True).data,
},
status=status.HTTP_200_OK,
)
return Response(
{"error": "Please specify a valid dashboard type"},
status=status.HTTP_400_BAD_REQUEST,
)
widget_key = request.GET.get("widget_key", "overview_stats")
WIDGETS_MAPPER = {
"overview_stats": dashboard_overview_stats,
"assigned_issues": dashboard_assigned_issues,
"created_issues": dashboard_created_issues,
"issues_by_state_groups": dashboard_issues_by_state_groups,
"issues_by_priority": dashboard_issues_by_priority,
"recent_activity": dashboard_recent_activity,
"recent_projects": dashboard_recent_projects,
"recent_collaborators": dashboard_recent_collaborators,
}
func = WIDGETS_MAPPER.get(widget_key)
if func is not None:
response = func(
self,
request=request,
slug=slug,
)
if isinstance(response, Response):
return response
return Response(
{"error": "Please specify a valid widget key"},
status=status.HTTP_400_BAD_REQUEST,
)
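The widget_key lookup above is plain dictionary dispatch: known keys resolve to a handler, unknown keys fall through to a 400 instead of raising. A reduced sketch with a hypothetical handler (the real mapper resolves the dashboard_* functions and also passes the view instance and request):

def overview(slug):
    return {"widget": "overview_stats", "slug": slug}

WIDGETS_MAPPER = {"overview_stats": overview}

handler = WIDGETS_MAPPER.get("overview_stats")
payload = handler("acme") if handler else {"error": "invalid widget key"}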
class WidgetsEndpoint(BaseAPIView):
def patch(self, request, dashboard_id, widget_id):
dashboard_widget = DashboardWidget.objects.filter(
widget_id=widget_id,
dashboard_id=dashboard_id,
).first()
dashboard_widget.is_visible = request.data.get(
"is_visible", dashboard_widget.is_visible
)
dashboard_widget.sort_order = request.data.get(
"sort_order", dashboard_widget.sort_order
)
dashboard_widget.filters = request.data.get(
"filters", dashboard_widget.filters
)
dashboard_widget.save()
return Response(
{"message": "successfully updated"}, status=status.HTTP_200_OK
)
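The PATCH body above uses request.data.get(key, current_value) so keys omitted from the payload leave the stored value untouched. The idiom in isolation:

payload = {"is_visible": False}  # sort_order intentionally omitted
widget_state = {"is_visible": True, "sort_order": 10000}
widget_state["is_visible"] = payload.get("is_visible", widget_state["is_visible"])
widget_state["sort_order"] = payload.get("sort_order", widget_state["sort_order"])
# widget_state -> {"is_visible": False, "sort_order": 10000}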

View File

@@ -39,9 +39,13 @@ class BulkEstimatePointEndpoint(BaseViewSet):
serializer_class = EstimateSerializer
def list(self, request, slug, project_id):
estimates = Estimate.objects.filter(
estimates = (
Estimate.objects.filter(
workspace__slug=slug, project_id=project_id
).prefetch_related("points").select_related("workspace", "project")
)
.prefetch_related("points")
.select_related("workspace", "project")
)
serializer = EstimateReadSerializer(estimates, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
@@ -54,13 +58,17 @@ class BulkEstimatePointEndpoint(BaseViewSet):
estimate_points = request.data.get("estimate_points", [])
serializer = EstimatePointSerializer(data=request.data.get("estimate_points"), many=True)
serializer = EstimatePointSerializer(
data=request.data.get("estimate_points"), many=True
)
if not serializer.is_valid():
return Response(
serializer.errors, status=status.HTTP_400_BAD_REQUEST
)
estimate_serializer = EstimateSerializer(data=request.data.get("estimate"))
estimate_serializer = EstimateSerializer(
data=request.data.get("estimate")
)
if not estimate_serializer.is_valid():
return Response(
estimate_serializer.errors, status=status.HTTP_400_BAD_REQUEST
@@ -135,7 +143,8 @@ class BulkEstimatePointEndpoint(BaseViewSet):
estimate_points = EstimatePoint.objects.filter(
pk__in=[
estimate_point.get("id") for estimate_point in estimate_points_data
estimate_point.get("id")
for estimate_point in estimate_points_data
],
workspace__slug=slug,
project_id=project_id,
@@ -157,10 +166,14 @@ class BulkEstimatePointEndpoint(BaseViewSet):
updated_estimate_points.append(estimate_point)
EstimatePoint.objects.bulk_update(
updated_estimate_points, ["value"], batch_size=10,
updated_estimate_points,
["value"],
batch_size=10,
)
estimate_point_serializer = EstimatePointSerializer(estimate_points, many=True)
estimate_point_serializer = EstimatePointSerializer(
estimate_points, many=True
)
return Response(
{
"estimate": estimate_serializer.data,

View File

@@ -63,9 +63,11 @@ class ExportIssuesEndpoint(BaseAPIView):
def get(self, request, slug):
exporter_history = ExporterHistory.objects.filter(
workspace__slug=slug
).select_related("workspace","initiated_by")
).select_related("workspace", "initiated_by")
if request.GET.get("per_page", False) and request.GET.get("cursor", False):
if request.GET.get("per_page", False) and request.GET.get(
"cursor", False
):
return self.paginate(
request=request,
queryset=exporter_history,

View File

@@ -14,7 +14,10 @@ from django.conf import settings
from .base import BaseAPIView
from plane.app.permissions import ProjectEntityPermission
from plane.db.models import Workspace, Project
from plane.app.serializers import ProjectLiteSerializer, WorkspaceLiteSerializer
from plane.app.serializers import (
ProjectLiteSerializer,
WorkspaceLiteSerializer,
)
from plane.utils.integrations.github import get_release_notes
from plane.license.utils.instance_value import get_configuration_value
@@ -51,7 +54,8 @@ class GPTIntegrationEndpoint(BaseAPIView):
if not task:
return Response(
{"error": "Task is required"}, status=status.HTTP_400_BAD_REQUEST
{"error": "Task is required"},
status=status.HTTP_400_BAD_REQUEST,
)
final_text = task + "\n" + prompt
@@ -89,7 +93,7 @@ class ReleaseNotesEndpoint(BaseAPIView):
class UnsplashEndpoint(BaseAPIView):
def get(self, request):
UNSPLASH_ACCESS_KEY, = get_configuration_value(
(UNSPLASH_ACCESS_KEY,) = get_configuration_value(
[
{
"key": "UNSPLASH_ACCESS_KEY",

View File

@@ -35,7 +35,10 @@ from plane.app.serializers import (
ModuleSerializer,
)
from plane.utils.integrations.github import get_github_repo_details
from plane.utils.importers.jira import jira_project_issue_summary, is_allowed_hostname
from plane.utils.importers.jira import (
jira_project_issue_summary,
is_allowed_hostname,
)
from plane.bgtasks.importer_task import service_importer
from plane.utils.html_processor import strip_tags
from plane.app.permissions import WorkSpaceAdminPermission
@@ -93,7 +96,8 @@ class ServiceIssueImportSummaryEndpoint(BaseAPIView):
for key, error_message in params.items():
if not request.GET.get(key, False):
return Response(
{"error": error_message}, status=status.HTTP_400_BAD_REQUEST
{"error": error_message},
status=status.HTTP_400_BAD_REQUEST,
)
project_key = request.GET.get("project_key", "")
@@ -236,7 +240,9 @@ class ImportServiceEndpoint(BaseAPIView):
return Response(serializer.data)
def delete(self, request, slug, service, pk):
importer = Importer.objects.get(pk=pk, service=service, workspace__slug=slug)
importer = Importer.objects.get(
pk=pk, service=service, workspace__slug=slug
)
if importer.imported_data is not None:
# Delete all imported Issues
@@ -254,8 +260,12 @@ class ImportServiceEndpoint(BaseAPIView):
return Response(status=status.HTTP_204_NO_CONTENT)
def patch(self, request, slug, service, pk):
importer = Importer.objects.get(pk=pk, service=service, workspace__slug=slug)
serializer = ImporterSerializer(importer, data=request.data, partial=True)
importer = Importer.objects.get(
pk=pk, service=service, workspace__slug=slug
)
serializer = ImporterSerializer(
importer, data=request.data, partial=True
)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK)
@@ -291,9 +301,9 @@ class BulkImportIssuesEndpoint(BaseAPIView):
).first()
# Get the maximum sequence_id
last_id = IssueSequence.objects.filter(project_id=project_id).aggregate(
largest=Max("sequence")
)["largest"]
last_id = IssueSequence.objects.filter(
project_id=project_id
).aggregate(largest=Max("sequence"))["largest"]
last_id = 1 if last_id is None else last_id + 1
@@ -326,7 +336,9 @@ class BulkImportIssuesEndpoint(BaseAPIView):
if issue_data.get("state", False)
else default_state.id,
name=issue_data.get("name", "Issue Created through Bulk"),
description_html=issue_data.get("description_html", "<p></p>"),
description_html=issue_data.get(
"description_html", "<p></p>"
),
description_stripped=(
None
if (
@@ -438,15 +450,21 @@ class BulkImportIssuesEndpoint(BaseAPIView):
for comment in comments_list
]
_ = IssueComment.objects.bulk_create(bulk_issue_comments, batch_size=100)
_ = IssueComment.objects.bulk_create(
bulk_issue_comments, batch_size=100
)
# Attach Links
_ = IssueLink.objects.bulk_create(
[
IssueLink(
issue=issue,
url=issue_data.get("link", {}).get("url", "https://github.com"),
title=issue_data.get("link", {}).get("title", "Original Issue"),
url=issue_data.get("link", {}).get(
"url", "https://github.com"
),
title=issue_data.get("link", {}).get(
"title", "Original Issue"
),
project_id=project_id,
workspace_id=project.workspace_id,
created_by=request.user,
@@ -483,14 +501,18 @@ class BulkImportModulesEndpoint(BaseAPIView):
ignore_conflicts=True,
)
modules = Module.objects.filter(id__in=[module.id for module in modules])
modules = Module.objects.filter(
id__in=[module.id for module in modules]
)
if len(modules) == len(modules_data):
_ = ModuleLink.objects.bulk_create(
[
ModuleLink(
module=module,
url=module_data.get("link", {}).get("url", "https://plane.so"),
url=module_data.get("link", {}).get(
"url", "https://plane.so"
),
title=module_data.get("link", {}).get(
"title", "Original Issue"
),
@@ -529,6 +551,8 @@ class BulkImportModulesEndpoint(BaseAPIView):
else:
return Response(
{"message": "Modules created but issues could not be imported"},
{
"message": "Modules created but issues could not be imported"
},
status=status.HTTP_200_OK,
)

View File

@@ -62,7 +62,9 @@ class InboxViewSet(BaseViewSet):
serializer.save(project_id=self.kwargs.get("project_id"))
def destroy(self, request, slug, project_id, pk):
inbox = Inbox.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
inbox = Inbox.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
# Handle default inbox delete
if inbox.is_default:
return Response(
@@ -90,7 +92,8 @@ class InboxIssueViewSet(BaseViewSet):
super()
.get_queryset()
.filter(
Q(snoozed_till__gte=timezone.now()) | Q(snoozed_till__isnull=True),
Q(snoozed_till__gte=timezone.now())
| Q(snoozed_till__isnull=True),
workspace__slug=self.kwargs.get("slug"),
project_id=self.kwargs.get("project_id"),
inbox_id=self.kwargs.get("inbox_id"),
@@ -107,12 +110,13 @@ class InboxIssueViewSet(BaseViewSet):
project_id=project_id,
)
.filter(**filters)
.annotate(bridge_id=F("issue_inbox__id"))
.select_related("workspace", "project", "state", "parent")
.prefetch_related("assignees", "labels")
.order_by("issue_inbox__snoozed_till", "issue_inbox__status")
.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -124,7 +128,9 @@ class InboxIssueViewSet(BaseViewSet):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -147,7 +153,8 @@ class InboxIssueViewSet(BaseViewSet):
def create(self, request, slug, project_id, inbox_id):
if not request.data.get("issue", {}).get("name", False):
return Response(
{"error": "Name is required"}, status=status.HTTP_400_BAD_REQUEST
{"error": "Name is required"},
status=status.HTTP_400_BAD_REQUEST,
)
# Check for valid priority
@@ -159,7 +166,8 @@ class InboxIssueViewSet(BaseViewSet):
"none",
]:
return Response(
{"error": "Invalid priority"}, status=status.HTTP_400_BAD_REQUEST
{"error": "Invalid priority"},
status=status.HTTP_400_BAD_REQUEST,
)
# Create or get state
@@ -204,9 +212,12 @@ class InboxIssueViewSet(BaseViewSet):
serializer = IssueStateInboxSerializer(issue)
return Response(serializer.data, status=status.HTTP_200_OK)
def partial_update(self, request, slug, project_id, inbox_id, pk):
def partial_update(self, request, slug, project_id, inbox_id, issue_id):
inbox_issue = InboxIssue.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
issue_id=issue_id,
workspace__slug=slug,
project_id=project_id,
inbox_id=inbox_id,
)
# Get the project member
project_member = ProjectMember.objects.get(
@@ -229,7 +240,9 @@ class InboxIssueViewSet(BaseViewSet):
if bool(issue_data):
issue = Issue.objects.get(
pk=inbox_issue.issue_id, workspace__slug=slug, project_id=project_id
pk=inbox_issue.issue_id,
workspace__slug=slug,
project_id=project_id,
)
# Only allow guests and viewers to edit name and description
if project_member.role <= 10:
@@ -239,7 +252,9 @@ class InboxIssueViewSet(BaseViewSet):
"description_html": issue_data.get(
"description_html", issue.description_html
),
"description": issue_data.get("description", issue.description),
"description": issue_data.get(
"description", issue.description
),
}
issue_serializer = IssueCreateSerializer(
@@ -285,7 +300,9 @@ class InboxIssueViewSet(BaseViewSet):
project_id=project_id,
)
state = State.objects.filter(
group="cancelled", workspace__slug=slug, project_id=project_id
group="cancelled",
workspace__slug=slug,
project_id=project_id,
).first()
if state is not None:
issue.state = state
@@ -303,32 +320,37 @@ class InboxIssueViewSet(BaseViewSet):
if issue.state.name == "Triage":
# Move to default state
state = State.objects.filter(
workspace__slug=slug, project_id=project_id, default=True
workspace__slug=slug,
project_id=project_id,
default=True,
).first()
if state is not None:
issue.state = state
issue.save()
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
return Response(
serializer.errors, status=status.HTTP_400_BAD_REQUEST
)
else:
return Response(
InboxIssueSerializer(inbox_issue).data, status=status.HTTP_200_OK
InboxIssueSerializer(inbox_issue).data,
status=status.HTTP_200_OK,
)
def retrieve(self, request, slug, project_id, inbox_id, pk):
inbox_issue = InboxIssue.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
)
def retrieve(self, request, slug, project_id, inbox_id, issue_id):
issue = Issue.objects.get(
pk=inbox_issue.issue_id, workspace__slug=slug, project_id=project_id
pk=issue_id, workspace__slug=slug, project_id=project_id
)
serializer = IssueStateInboxSerializer(issue)
return Response(serializer.data, status=status.HTTP_200_OK)
def destroy(self, request, slug, project_id, inbox_id, pk):
def destroy(self, request, slug, project_id, inbox_id, issue_id):
inbox_issue = InboxIssue.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id, inbox_id=inbox_id
issue_id=issue_id,
workspace__slug=slug,
project_id=project_id,
inbox_id=inbox_id,
)
# Get the project member
project_member = ProjectMember.objects.get(
@@ -350,9 +372,8 @@ class InboxIssueViewSet(BaseViewSet):
if inbox_issue.status in [-2, -1, 0, 2]:
# Delete the issue also
Issue.objects.filter(
workspace__slug=slug, project_id=project_id, pk=inbox_issue.issue_id
workspace__slug=slug, project_id=project_id, pk=issue_id
).delete()
inbox_issue.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
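The inbox queryset near the top of this file selects issues with a two-branch snooze predicate; isolated, the filter looks like this (a sketch of the same Q expression, keeping rows whose snoozed_till is in the future or unset):

from django.db.models import Q
from django.utils import timezone

snooze_filter = Q(snoozed_till__gte=timezone.now()) | Q(snoozed_till__isnull=True)
# usage shape: InboxIssue.objects.filter(snooze_filter, inbox_id=inbox_id)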

View File

@@ -1,6 +1,7 @@
# Python imports
import uuid
import requests
# Django imports
from django.contrib.auth.hashers import make_password
@@ -19,7 +20,10 @@ from plane.db.models import (
WorkspaceMember,
APIToken,
)
from plane.app.serializers import IntegrationSerializer, WorkspaceIntegrationSerializer
from plane.app.serializers import (
IntegrationSerializer,
WorkspaceIntegrationSerializer,
)
from plane.utils.integrations.github import (
get_github_metadata,
delete_github_installation,
@@ -27,6 +31,7 @@ from plane.utils.integrations.github import (
from plane.app.permissions import WorkSpaceAdminPermission
from plane.utils.integrations.slack import slack_oauth
class IntegrationViewSet(BaseViewSet):
serializer_class = IntegrationSerializer
model = Integration
@@ -101,7 +106,10 @@ class WorkspaceIntegrationViewSet(BaseViewSet):
code = request.data.get("code", False)
if not code:
return Response({"error": "Code is required"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "Code is required"},
status=status.HTTP_400_BAD_REQUEST,
)
slack_response = slack_oauth(code=code)
@@ -110,7 +118,9 @@ class WorkspaceIntegrationViewSet(BaseViewSet):
team_id = metadata.get("team", {}).get("id", False)
if not metadata or not access_token or not team_id:
return Response(
{"error": "Slack could not be installed. Please try again later"},
{
"error": "Slack could not be installed. Please try again later"
},
status=status.HTTP_400_BAD_REQUEST,
)
config = {"team_id": team_id, "access_token": access_token}

View File

@@ -21,7 +21,10 @@ from plane.app.serializers import (
GithubCommentSyncSerializer,
)
from plane.utils.integrations.github import get_github_repos
from plane.app.permissions import ProjectBasePermission, ProjectEntityPermission
from plane.app.permissions import (
ProjectBasePermission,
ProjectEntityPermission,
)
class GithubRepositoriesEndpoint(BaseAPIView):
@@ -185,7 +188,6 @@ class BulkCreateGithubIssueSyncEndpoint(BaseAPIView):
class GithubCommentSyncViewSet(BaseViewSet):
permission_classes = [
ProjectEntityPermission,
]

View File

@@ -8,9 +8,16 @@ from sentry_sdk import capture_exception
# Module imports
from plane.app.views import BaseViewSet, BaseAPIView
from plane.db.models import SlackProjectSync, WorkspaceIntegration, ProjectMember
from plane.db.models import (
SlackProjectSync,
WorkspaceIntegration,
ProjectMember,
)
from plane.app.serializers import SlackProjectSyncSerializer
from plane.app.permissions import ProjectBasePermission, ProjectEntityPermission
from plane.app.permissions import (
ProjectBasePermission,
ProjectEntityPermission,
)
from plane.utils.integrations.slack import slack_oauth
@@ -38,7 +45,8 @@ class SlackProjectSyncViewSet(BaseViewSet):
if not code:
return Response(
{"error": "Code is required"}, status=status.HTTP_400_BAD_REQUEST
{"error": "Code is required"},
status=status.HTTP_400_BAD_REQUEST,
)
slack_response = slack_oauth(code=code)
@@ -54,7 +62,9 @@ class SlackProjectSyncViewSet(BaseViewSet):
access_token=slack_response.get("access_token"),
scopes=slack_response.get("scope"),
bot_user_id=slack_response.get("bot_user_id"),
webhook_url=slack_response.get("incoming_webhook", {}).get("url"),
webhook_url=slack_response.get("incoming_webhook", {}).get(
"url"
),
data=slack_response,
team_id=slack_response.get("team", {}).get("id"),
team_name=slack_response.get("team", {}).get("name"),
@@ -62,7 +72,9 @@ class SlackProjectSyncViewSet(BaseViewSet):
project_id=project_id,
)
_ = ProjectMember.objects.get_or_create(
member=workspace_integration.actor, role=20, project_id=project_id
member=workspace_integration.actor,
role=20,
project_id=project_id,
)
serializer = SlackProjectSyncSerializer(slack_project_sync)
return Response(serializer.data, status=status.HTTP_200_OK)
@@ -74,6 +86,8 @@ class SlackProjectSyncViewSet(BaseViewSet):
)
capture_exception(e)
return Response(
{"error": "Slack could not be installed. Please try again later"},
{
"error": "Slack could not be installed. Please try again later"
},
status=status.HTTP_400_BAD_REQUEST,
)

View File

@@ -34,11 +34,11 @@ from rest_framework.parsers import MultiPartParser, FormParser
# Module imports
from . import BaseViewSet, BaseAPIView, WebhookMixin
from plane.app.serializers import (
IssueCreateSerializer,
IssueActivitySerializer,
IssueCommentSerializer,
IssuePropertySerializer,
IssueSerializer,
IssueCreateSerializer,
LabelSerializer,
IssueFlatSerializer,
IssueLinkSerializer,
@@ -81,7 +81,7 @@ from plane.db.models import (
from plane.bgtasks.issue_activites_task import issue_activity
from plane.utils.grouper import group_results
from plane.utils.issue_filters import issue_filters
from collections import defaultdict
class IssueViewSet(WebhookMixin, BaseViewSet):
def get_serializer_class(self):
@@ -109,13 +109,9 @@ class IssueViewSet(WebhookMixin, BaseViewSet):
def get_queryset(self):
return (
Issue.issue_objects.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
Issue.issue_objects.filter(
project_id=self.kwargs.get("project_id")
)
.filter(project_id=self.kwargs.get("project_id"))
.filter(workspace__slug=self.kwargs.get("slug"))
.select_related("project")
.select_related("workspace")
@@ -129,22 +125,6 @@ class IssueViewSet(WebhookMixin, BaseViewSet):
queryset=IssueReaction.objects.select_related("actor"),
)
)
).distinct()
@method_decorator(gzip_page)
def list(self, request, slug, project_id):
fields = [field for field in request.GET.get("fields", "").split(",") if field]
filters = issue_filters(request.query_params, "GET")
# Custom ordering for priority and state
priority_order = ["urgent", "high", "medium", "low", "none"]
state_order = ["backlog", "unstarted", "started", "completed", "cancelled"]
order_by_param = request.GET.get("order_by", "-created_at")
issue_queryset = (
self.get_queryset()
.filter(**filters)
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(module_id=F("issue_module__module_id"))
.annotate(
@@ -154,17 +134,47 @@ class IssueViewSet(WebhookMixin, BaseViewSet):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
).distinct()
@method_decorator(gzip_page)
def list(self, request, slug, project_id):
filters = issue_filters(request.query_params, "GET")
# Custom ordering for priority and state
priority_order = ["urgent", "high", "medium", "low", "none"]
state_order = [
"backlog",
"unstarted",
"started",
"completed",
"cancelled",
]
order_by_param = request.GET.get("order_by", "-created_at")
issue_queryset = self.get_queryset().filter(**filters)
# Priority Ordering
if order_by_param == "priority" or order_by_param == "-priority":
priority_order = (
priority_order if order_by_param == "priority" else priority_order[::-1]
priority_order
if order_by_param == "priority"
else priority_order[::-1]
)
issue_queryset = issue_queryset.annotate(
priority_order=Case(
@@ -212,14 +222,17 @@ class IssueViewSet(WebhookMixin, BaseViewSet):
else order_by_param
)
).order_by(
"-max_values" if order_by_param.startswith("-") else "max_values"
"-max_values"
if order_by_param.startswith("-")
else "max_values"
)
else:
issue_queryset = issue_queryset.order_by(order_by_param)
issues = IssueLiteSerializer(issue_queryset, many=True, fields=fields if fields else None).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(issue_dict, status=status.HTTP_200_OK)
issues = IssueSerializer(
issue_queryset, many=True, fields=self.fields, expand=self.expand
).data
return Response(issues, status=status.HTTP_200_OK)
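The list response above passes fields/expand straight into the serializer, which presumes a dynamic-fields base class. A sketch of that pattern under the assumption it follows the common DRF recipe (not the project's actual implementation):

from rest_framework import serializers

class DynamicFieldsSerializer(serializers.ModelSerializer):
    # Drops any declared field not named in `fields`; `expand` is accepted
    # here only so call sites compile -- expansion itself is out of scope.
    def __init__(self, *args, fields=None, expand=None, **kwargs):
        super().__init__(*args, **kwargs)
        if fields:
            for name in set(self.fields) - set(fields):
                self.fields.pop(name)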
def create(self, request, slug, project_id):
project = Project.objects.get(pk=project_id)
@@ -239,32 +252,42 @@ class IssueViewSet(WebhookMixin, BaseViewSet):
# Track the issue
issue_activity.delay(
type="issue.activity.created",
requested_data=json.dumps(self.request.data, cls=DjangoJSONEncoder),
requested_data=json.dumps(
self.request.data, cls=DjangoJSONEncoder
),
actor_id=str(request.user.id),
issue_id=str(serializer.data.get("id", None)),
project_id=str(project_id),
current_instance=None,
epoch=int(timezone.now().timestamp()),
)
issue = (
self.get_queryset().filter(pk=serializer.data["id"]).first()
)
serializer = IssueSerializer(issue)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def retrieve(self, request, slug, project_id, pk=None):
issue = Issue.issue_objects.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
).get(workspace__slug=slug, project_id=project_id, pk=pk)
return Response(IssueSerializer(issue).data, status=status.HTTP_200_OK)
issue = self.get_queryset().filter(pk=pk).first()
return Response(
IssueSerializer(
issue, fields=self.fields, expand=self.expand
).data,
status=status.HTTP_200_OK,
)
def partial_update(self, request, slug, project_id, pk=None):
issue = Issue.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
issue = Issue.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
current_instance = json.dumps(
IssueSerializer(issue).data, cls=DjangoJSONEncoder
)
requested_data = json.dumps(self.request.data, cls=DjangoJSONEncoder)
serializer = IssueCreateSerializer(issue, data=request.data, partial=True)
serializer = IssueCreateSerializer(
issue, data=request.data, partial=True
)
if serializer.is_valid():
serializer.save()
issue_activity.delay(
@@ -276,11 +299,16 @@ class IssueViewSet(WebhookMixin, BaseViewSet):
current_instance=current_instance,
epoch=int(timezone.now().timestamp()),
)
return Response(serializer.data, status=status.HTTP_200_OK)
issue = self.get_queryset().filter(pk=pk).first()
return Response(
IssueSerializer(issue).data, status=status.HTTP_200_OK
)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def destroy(self, request, slug, project_id, pk=None):
issue = Issue.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
issue = Issue.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
current_instance = json.dumps(
IssueSerializer(issue).data, cls=DjangoJSONEncoder
)
@@ -303,7 +331,13 @@ class UserWorkSpaceIssues(BaseAPIView):
filters = issue_filters(request.query_params, "GET")
# Custom ordering for priority and state
priority_order = ["urgent", "high", "medium", "low", "none"]
state_order = ["backlog", "unstarted", "started", "completed", "cancelled"]
state_order = [
"backlog",
"unstarted",
"started",
"completed",
"cancelled",
]
order_by_param = request.GET.get("order_by", "-created_at")
@@ -317,7 +351,9 @@ class UserWorkSpaceIssues(BaseAPIView):
workspace__slug=slug,
)
.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -336,7 +372,9 @@ class UserWorkSpaceIssues(BaseAPIView):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -353,7 +391,9 @@ class UserWorkSpaceIssues(BaseAPIView):
# Priority Ordering
if order_by_param == "priority" or order_by_param == "-priority":
priority_order = (
priority_order if order_by_param == "priority" else priority_order[::-1]
priority_order
if order_by_param == "priority"
else priority_order[::-1]
)
issue_queryset = issue_queryset.annotate(
priority_order=Case(
@@ -401,7 +441,9 @@ class UserWorkSpaceIssues(BaseAPIView):
else order_by_param
)
).order_by(
"-max_values" if order_by_param.startswith("-") else "max_values"
"-max_values"
if order_by_param.startswith("-")
else "max_values"
)
else:
issue_queryset = issue_queryset.order_by(order_by_param)
@@ -470,7 +512,9 @@ class IssueActivityEndpoint(BaseAPIView):
)
)
)
issue_activities = IssueActivitySerializer(issue_activities, many=True).data
issue_activities = IssueActivitySerializer(
issue_activities, many=True
).data
issue_comments = IssueCommentSerializer(issue_comments, many=True).data
result_list = sorted(
@@ -528,7 +572,9 @@ class IssueCommentViewSet(WebhookMixin, BaseViewSet):
)
issue_activity.delay(
type="comment.activity.created",
requested_data=json.dumps(serializer.data, cls=DjangoJSONEncoder),
requested_data=json.dumps(
serializer.data, cls=DjangoJSONEncoder
),
actor_id=str(self.request.user.id),
issue_id=str(self.kwargs.get("issue_id")),
project_id=str(self.kwargs.get("project_id")),
@@ -540,7 +586,10 @@ class IssueCommentViewSet(WebhookMixin, BaseViewSet):
def partial_update(self, request, slug, project_id, issue_id, pk):
issue_comment = IssueComment.objects.get(
workspace__slug=slug, project_id=project_id, issue_id=issue_id, pk=pk
workspace__slug=slug,
project_id=project_id,
issue_id=issue_id,
pk=pk,
)
requested_data = json.dumps(self.request.data, cls=DjangoJSONEncoder)
current_instance = json.dumps(
@@ -566,7 +615,10 @@ class IssueCommentViewSet(WebhookMixin, BaseViewSet):
def destroy(self, request, slug, project_id, issue_id, pk):
issue_comment = IssueComment.objects.get(
workspace__slug=slug, project_id=project_id, issue_id=issue_id, pk=pk
workspace__slug=slug,
project_id=project_id,
issue_id=issue_id,
pk=pk,
)
current_instance = json.dumps(
IssueCommentSerializer(issue_comment).data,
@@ -590,16 +642,21 @@ class IssueUserDisplayPropertyEndpoint(BaseAPIView):
ProjectLitePermission,
]
def post(self, request, slug, project_id):
issue_property, created = IssueProperty.objects.get_or_create(
def patch(self, request, slug, project_id):
issue_property = IssueProperty.objects.get(
user=request.user,
project_id=project_id,
)
if not created:
issue_property.properties = request.data.get("properties", {})
issue_property.save()
issue_property.properties = request.data.get("properties", {})
issue_property.filters = request.data.get(
"filters", issue_property.filters
)
issue_property.display_filters = request.data.get(
"display_filters", issue_property.display_filters
)
issue_property.display_properties = request.data.get(
"display_properties", issue_property.display_properties
)
issue_property.save()
serializer = IssuePropertySerializer(issue_property)
return Response(serializer.data, status=status.HTTP_201_CREATED)
@@ -624,11 +681,17 @@ class LabelViewSet(BaseViewSet):
serializer = LabelSerializer(data=request.data)
if serializer.is_valid():
serializer.save(project_id=project_id)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
return Response(
serializer.data, status=status.HTTP_201_CREATED
)
return Response(
serializer.errors, status=status.HTTP_400_BAD_REQUEST
)
except IntegrityError:
return Response(
{"error": "Label with the same name already exists in the project"},
{
"error": "Label with the same name already exists in the project"
},
status=status.HTTP_400_BAD_REQUEST,
)
@@ -683,7 +746,9 @@ class SubIssuesEndpoint(BaseAPIView):
@method_decorator(gzip_page)
def get(self, request, slug, project_id, issue_id):
sub_issues = (
Issue.issue_objects.filter(parent_id=issue_id, workspace__slug=slug)
Issue.issue_objects.filter(
parent_id=issue_id, workspace__slug=slug
)
.select_related("project")
.select_related("workspace")
.select_related("state")
@@ -691,7 +756,9 @@ class SubIssuesEndpoint(BaseAPIView):
.prefetch_related("assignees")
.prefetch_related("labels")
.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -703,7 +770,9 @@ class SubIssuesEndpoint(BaseAPIView):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -714,21 +783,15 @@ class SubIssuesEndpoint(BaseAPIView):
queryset=IssueReaction.objects.select_related("actor"),
)
)
.annotate(state_group=F("state__group"))
)
state_distribution = (
State.objects.filter(workspace__slug=slug, state_issue__parent_id=issue_id)
.annotate(state_group=F("group"))
.values("state_group")
.annotate(state_count=Count("state_group"))
.order_by("state_group")
)
# creates a dict mapping each state group to its issue ids
result = defaultdict(list)
for sub_issue in sub_issues:
result[sub_issue.state_group].append(str(sub_issue.id))
result = {
item["state_group"]: item["state_count"] for item in state_distribution
}
serializer = IssueLiteSerializer(
serializer = IssueSerializer(
sub_issues,
many=True,
)
@@ -758,7 +821,7 @@ class SubIssuesEndpoint(BaseAPIView):
_ = Issue.objects.bulk_update(sub_issues, ["parent"], batch_size=10)
updated_sub_issues = Issue.issue_objects.filter(id__in=sub_issue_ids)
updated_sub_issues = Issue.issue_objects.filter(id__in=sub_issue_ids).annotate(state_group=F("state__group"))
# Track the issue
_ = [
@@ -774,12 +837,25 @@ class SubIssuesEndpoint(BaseAPIView):
for sub_issue_id in sub_issue_ids
]
# creates a dict mapping each state group to its issue ids
result = defaultdict(list)
for sub_issue in updated_sub_issues:
result[sub_issue.state_group].append(str(sub_issue.id))
serializer = IssueSerializer(
updated_sub_issues,
many=True,
)
return Response(
IssueFlatSerializer(updated_sub_issues, many=True).data,
{
"sub_issues": serializer.data,
"state_distribution": result,
},
status=status.HTTP_200_OK,
)
class IssueLinkViewSet(BaseViewSet):
permission_classes = [
ProjectEntityPermission,
@@ -809,7 +885,9 @@ class IssueLinkViewSet(BaseViewSet):
)
issue_activity.delay(
type="link.activity.created",
requested_data=json.dumps(serializer.data, cls=DjangoJSONEncoder),
requested_data=json.dumps(
serializer.data, cls=DjangoJSONEncoder
),
actor_id=str(self.request.user.id),
issue_id=str(self.kwargs.get("issue_id")),
project_id=str(self.kwargs.get("project_id")),
@@ -821,14 +899,19 @@ class IssueLinkViewSet(BaseViewSet):
def partial_update(self, request, slug, project_id, issue_id, pk):
issue_link = IssueLink.objects.get(
workspace__slug=slug, project_id=project_id, issue_id=issue_id, pk=pk
workspace__slug=slug,
project_id=project_id,
issue_id=issue_id,
pk=pk,
)
requested_data = json.dumps(request.data, cls=DjangoJSONEncoder)
current_instance = json.dumps(
IssueLinkSerializer(issue_link).data,
cls=DjangoJSONEncoder,
)
serializer = IssueLinkSerializer(issue_link, data=request.data, partial=True)
serializer = IssueLinkSerializer(
issue_link, data=request.data, partial=True
)
if serializer.is_valid():
serializer.save()
issue_activity.delay(
@@ -845,7 +928,10 @@ class IssueLinkViewSet(BaseViewSet):
def destroy(self, request, slug, project_id, issue_id, pk):
issue_link = IssueLink.objects.get(
workspace__slug=slug, project_id=project_id, issue_id=issue_id, pk=pk
workspace__slug=slug,
project_id=project_id,
issue_id=issue_id,
pk=pk,
)
current_instance = json.dumps(
IssueLinkSerializer(issue_link).data,
@@ -971,13 +1057,23 @@ class IssueArchiveViewSet(BaseViewSet):
@method_decorator(gzip_page)
def list(self, request, slug, project_id):
fields = [field for field in request.GET.get("fields", "").split(",") if field]
fields = [
field
for field in request.GET.get("fields", "").split(",")
if field
]
filters = issue_filters(request.query_params, "GET")
show_sub_issues = request.GET.get("show_sub_issues", "true")
# Custom ordering for priority and state
priority_order = ["urgent", "high", "medium", "low", "none"]
state_order = ["backlog", "unstarted", "started", "completed", "cancelled"]
state_order = [
"backlog",
"unstarted",
"started",
"completed",
"cancelled",
]
order_by_param = request.GET.get("order_by", "-created_at")
@@ -993,7 +1089,9 @@ class IssueArchiveViewSet(BaseViewSet):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -1003,7 +1101,9 @@ class IssueArchiveViewSet(BaseViewSet):
# Priority Ordering
if order_by_param == "priority" or order_by_param == "-priority":
priority_order = (
priority_order if order_by_param == "priority" else priority_order[::-1]
priority_order
if order_by_param == "priority"
else priority_order[::-1]
)
issue_queryset = issue_queryset.annotate(
priority_order=Case(
@@ -1051,7 +1151,9 @@ class IssueArchiveViewSet(BaseViewSet):
else order_by_param
)
).order_by(
"-max_values" if order_by_param.startswith("-") else "max_values"
"-max_values"
if order_by_param.startswith("-")
else "max_values"
)
else:
issue_queryset = issue_queryset.order_by(order_by_param)
@@ -1062,9 +1164,10 @@ class IssueArchiveViewSet(BaseViewSet):
else issue_queryset.filter(parent__isnull=True)
)
issues = IssueLiteSerializer(issue_queryset, many=True, fields=fields if fields else None).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(issue_dict, status=status.HTTP_200_OK)
issues = IssueSerializer(
issue_queryset, many=True, fields=fields if fields else None
).data
return Response(issues, status=status.HTTP_200_OK)
def retrieve(self, request, slug, project_id, pk=None):
issue = Issue.objects.get(
@@ -1138,24 +1241,11 @@ class IssueSubscriberViewSet(BaseViewSet):
)
def list(self, request, slug, project_id, issue_id):
members = (
ProjectMember.objects.filter(
members = ProjectMember.objects.filter(
workspace__slug=slug,
project_id=project_id,
is_active=True,
)
.annotate(
is_subscribed=Exists(
IssueSubscriber.objects.filter(
workspace__slug=slug,
project_id=project_id,
issue_id=issue_id,
subscriber=OuterRef("member"),
)
)
)
.select_related("member")
)
).select_related("member")
serializer = ProjectMemberLiteSerializer(members, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
@@ -1210,7 +1300,9 @@ class IssueSubscriberViewSet(BaseViewSet):
workspace__slug=slug,
project=project_id,
).exists()
return Response({"subscribed": issue_subscriber}, status=status.HTTP_200_OK)
return Response(
{"subscribed": issue_subscriber}, status=status.HTTP_200_OK
)
class IssueReactionViewSet(BaseViewSet):
@@ -1365,23 +1457,95 @@ class IssueRelationViewSet(BaseViewSet):
.distinct()
)
def list(self, request, slug, project_id, issue_id):
issue_relations = (
IssueRelation.objects.filter(
Q(issue_id=issue_id) | Q(related_issue=issue_id)
)
.filter(workspace__slug=self.kwargs.get("slug"))
.select_related("project")
.select_related("workspace")
.select_related("issue")
.order_by("-created_at")
.distinct()
)
blocking_issues = issue_relations.filter(
relation_type="blocked_by", related_issue_id=issue_id
)
blocked_by_issues = issue_relations.filter(
relation_type="blocked_by", issue_id=issue_id
)
duplicate_issues = issue_relations.filter(
issue_id=issue_id, relation_type="duplicate"
)
duplicate_issues_related = issue_relations.filter(
related_issue_id=issue_id, relation_type="duplicate"
)
relates_to_issues = issue_relations.filter(
issue_id=issue_id, relation_type="relates_to"
)
relates_to_issues_related = issue_relations.filter(
related_issue_id=issue_id, relation_type="relates_to"
)
blocked_by_issues_serialized = IssueRelationSerializer(
blocked_by_issues, many=True
).data
duplicate_issues_serialized = IssueRelationSerializer(
duplicate_issues, many=True
).data
relates_to_issues_serialized = IssueRelationSerializer(
relates_to_issues, many=True
).data
# reverse relation for blocked by issues
blocking_issues_serialized = RelatedIssueSerializer(
blocking_issues, many=True
).data
# reverse relation for duplicate issues
duplicate_issues_related_serialized = RelatedIssueSerializer(
duplicate_issues_related, many=True
).data
# reverse relation for related issues
relates_to_issues_related_serialized = RelatedIssueSerializer(
relates_to_issues_related, many=True
).data
response_data = {
"blocking": blocking_issues_serialized,
"blocked_by": blocked_by_issues_serialized,
"duplicate": duplicate_issues_serialized
+ duplicate_issues_related_serialized,
"relates_to": relates_to_issues_serialized
+ relates_to_issues_related_serialized,
}
return Response(response_data, status=status.HTTP_200_OK)
def create(self, request, slug, project_id, issue_id):
related_list = request.data.get("related_list", [])
relation = request.data.get("relation", None)
relation_type = request.data.get("relation_type", None)
issues = request.data.get("issues", [])
project = Project.objects.get(pk=project_id)
issue_relation = IssueRelation.objects.bulk_create(
[
IssueRelation(
issue_id=related_issue["issue"],
related_issue_id=related_issue["related_issue"],
relation_type=related_issue["relation_type"],
issue_id=issue
if relation_type == "blocking"
else issue_id,
related_issue_id=issue_id
if relation_type == "blocking"
else issue,
relation_type="blocked_by"
if relation_type == "blocking"
else relation_type,
project_id=project_id,
workspace_id=project.workspace_id,
created_by=request.user,
updated_by=request.user,
)
for related_issue in related_list
for issue in issues
],
batch_size=10,
ignore_conflicts=True,
@@ -1397,7 +1561,7 @@ class IssueRelationViewSet(BaseViewSet):
epoch=int(timezone.now().timestamp()),
)
if relation == "blocking":
if relation_type == "blocking":
return Response(
RelatedIssueSerializer(issue_relation, many=True).data,
status=status.HTTP_201_CREATED,
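The create() rewrite above stores a client-side "blocking" relation as its inverse "blocked_by" row, swapping the two issue ids. The direction flip in isolation, as a hypothetical helper:

def relation_row(issue_id, other_id, relation_type):
    # a "blocking" request is persisted as the inverse "blocked_by" edge
    if relation_type == "blocking":
        return {"issue_id": other_id, "related_issue_id": issue_id,
                "relation_type": "blocked_by"}
    return {"issue_id": issue_id, "related_issue_id": other_id,
            "relation_type": relation_type}

relation_row("A", "B", "blocking")
# -> {"issue_id": "B", "related_issue_id": "A", "relation_type": "blocked_by"}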
@@ -1408,9 +1572,23 @@ class IssueRelationViewSet(BaseViewSet):
status=status.HTTP_201_CREATED,
)
def destroy(self, request, slug, project_id, issue_id, pk):
def remove_relation(self, request, slug, project_id, issue_id):
relation_type = request.data.get("relation_type", None)
related_issue = request.data.get("related_issue", None)
if relation_type == "blocking":
issue_relation = IssueRelation.objects.get(
workspace__slug=slug, project_id=project_id, issue_id=issue_id, pk=pk
workspace__slug=slug,
project_id=project_id,
issue_id=related_issue,
related_issue_id=issue_id,
)
else:
issue_relation = IssueRelation.objects.get(
workspace__slug=slug,
project_id=project_id,
issue_id=issue_id,
related_issue_id=related_issue,
)
current_instance = json.dumps(
IssueRelationSerializer(issue_relation).data,
@@ -1419,7 +1597,7 @@ class IssueRelationViewSet(BaseViewSet):
issue_relation.delete()
issue_activity.delay(
type="issue_relation.activity.deleted",
requested_data=json.dumps({"related_list": None}),
requested_data=json.dumps(request.data, cls=DjangoJSONEncoder),
actor_id=str(request.user.id),
issue_id=str(issue_id),
project_id=str(project_id),
@@ -1439,7 +1617,9 @@ class IssueDraftViewSet(BaseViewSet):
def get_queryset(self):
return (
Issue.objects.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -1464,11 +1644,21 @@ class IssueDraftViewSet(BaseViewSet):
@method_decorator(gzip_page)
def list(self, request, slug, project_id):
filters = issue_filters(request.query_params, "GET")
fields = [field for field in request.GET.get("fields", "").split(",") if field]
fields = [
field
for field in request.GET.get("fields", "").split(",")
if field
]
# Custom ordering for priority and state
priority_order = ["urgent", "high", "medium", "low", "none"]
state_order = ["backlog", "unstarted", "started", "completed", "cancelled"]
state_order = [
"backlog",
"unstarted",
"started",
"completed",
"cancelled",
]
order_by_param = request.GET.get("order_by", "-created_at")
@@ -1484,7 +1674,9 @@ class IssueDraftViewSet(BaseViewSet):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -1494,7 +1686,9 @@ class IssueDraftViewSet(BaseViewSet):
# Priority Ordering
if order_by_param == "priority" or order_by_param == "-priority":
priority_order = (
priority_order if order_by_param == "priority" else priority_order[::-1]
priority_order
if order_by_param == "priority"
else priority_order[::-1]
)
issue_queryset = issue_queryset.annotate(
priority_order=Case(
@@ -1542,14 +1736,17 @@ class IssueDraftViewSet(BaseViewSet):
else order_by_param
)
).order_by(
"-max_values" if order_by_param.startswith("-") else "max_values"
"-max_values"
if order_by_param.startswith("-")
else "max_values"
)
else:
issue_queryset = issue_queryset.order_by(order_by_param)
issues = IssueLiteSerializer(issue_queryset, many=True, fields=fields if fields else None).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(issue_dict, status=status.HTTP_200_OK)
issues = IssueSerializer(
issue_queryset, many=True, fields=fields if fields else None
).data
return Response(issues, status=status.HTTP_200_OK)
def create(self, request, slug, project_id):
project = Project.objects.get(pk=project_id)
@@ -1569,7 +1766,9 @@ class IssueDraftViewSet(BaseViewSet):
# Track the issue
issue_activity.delay(
type="issue_draft.activity.created",
requested_data=json.dumps(self.request.data, cls=DjangoJSONEncoder),
requested_data=json.dumps(
self.request.data, cls=DjangoJSONEncoder
),
actor_id=str(request.user.id),
issue_id=str(serializer.data.get("id", None)),
project_id=str(project_id),
@@ -1580,14 +1779,18 @@ class IssueDraftViewSet(BaseViewSet):
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def partial_update(self, request, slug, project_id, pk):
issue = Issue.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
issue = Issue.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
serializer = IssueSerializer(issue, data=request.data, partial=True)
if serializer.is_valid():
if request.data.get("is_draft") is not None and not request.data.get(
if request.data.get(
"is_draft"
):
serializer.save(created_at=timezone.now(), updated_at=timezone.now())
) is not None and not request.data.get("is_draft"):
serializer.save(
created_at=timezone.now(), updated_at=timezone.now()
)
else:
serializer.save()
issue_activity.delay(
@@ -1612,7 +1815,9 @@ class IssueDraftViewSet(BaseViewSet):
return Response(IssueSerializer(issue).data, status=status.HTTP_200_OK)
def destroy(self, request, slug, project_id, pk=None):
issue = Issue.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
issue = Issue.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
current_instance = json.dumps(
IssueSerializer(issue).data, cls=DjangoJSONEncoder
)

View File

@@ -20,9 +20,13 @@ from plane.app.serializers import (
ModuleIssueSerializer,
ModuleLinkSerializer,
ModuleFavoriteSerializer,
IssueStateSerializer,
IssueSerializer,
ModuleUserPropertiesSerializer,
)
from plane.app.permissions import (
ProjectEntityPermission,
ProjectLitePermission,
)
from plane.app.permissions import ProjectEntityPermission
from plane.db.models import (
Module,
ModuleIssue,
@@ -32,6 +36,8 @@ from plane.db.models import (
ModuleFavorite,
IssueLink,
IssueAttachment,
IssueSubscriber,
ModuleUserProperties,
)
from plane.bgtasks.issue_activites_task import issue_activity
from plane.utils.grouper import group_results
@@ -54,7 +60,6 @@ class ModuleViewSet(WebhookMixin, BaseViewSet):
)
def get_queryset(self):
subquery = ModuleFavorite.objects.filter(
user=self.request.user,
module_id=OuterRef("pk"),
@@ -74,7 +79,9 @@ class ModuleViewSet(WebhookMixin, BaseViewSet):
.prefetch_related(
Prefetch(
"link_module",
queryset=ModuleLink.objects.select_related("module", "created_by"),
queryset=ModuleLink.objects.select_related(
"module", "created_by"
),
)
)
.annotate(
@@ -136,7 +143,7 @@ class ModuleViewSet(WebhookMixin, BaseViewSet):
),
)
)
.order_by("-is_favorite","-created_at")
.order_by("-is_favorite", "-created_at")
)
def create(self, request, slug, project_id):
@@ -153,6 +160,18 @@ class ModuleViewSet(WebhookMixin, BaseViewSet):
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
def list(self, request, slug, project_id):
queryset = self.get_queryset()
fields = [
field
for field in request.GET.get("fields", "").split(",")
if field
]
modules = ModuleSerializer(
queryset, many=True, fields=fields if fields else None
).data
return Response(modules, status=status.HTTP_200_OK)
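The fields parsing added above tolerates empty and dangling commas; for example:

# "name,,start_date" -> ["name", "start_date"]; "" -> [] (serializer keeps all fields)
fields = [f for f in "name,,start_date".split(",") if f]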
def retrieve(self, request, slug, project_id, pk):
queryset = self.get_queryset().get(pk=pk)
@@ -167,7 +186,13 @@ class ModuleViewSet(WebhookMixin, BaseViewSet):
.annotate(assignee_id=F("assignees__id"))
.annotate(display_name=F("assignees__display_name"))
.annotate(avatar=F("assignees__avatar"))
.values("first_name", "last_name", "assignee_id", "avatar", "display_name")
.values(
"first_name",
"last_name",
"assignee_id",
"avatar",
"display_name",
)
.annotate(
total_issues=Count(
"assignee_id",
@@ -251,7 +276,10 @@ class ModuleViewSet(WebhookMixin, BaseViewSet):
if queryset.start_date and queryset.target_date:
data["distribution"]["completion_chart"] = burndown_plot(
queryset=queryset, slug=slug, project_id=project_id, module_id=pk
queryset=queryset,
slug=slug,
project_id=project_id,
module_id=pk,
)
return Response(
@@ -260,9 +288,13 @@ class ModuleViewSet(WebhookMixin, BaseViewSet):
)
def destroy(self, request, slug, project_id, pk):
module = Module.objects.get(workspace__slug=slug, project_id=project_id, pk=pk)
module = Module.objects.get(
workspace__slug=slug, project_id=project_id, pk=pk
)
module_issues = list(
ModuleIssue.objects.filter(module_id=pk).values_list("issue", flat=True)
ModuleIssue.objects.filter(module_id=pk).values_list(
"issue", flat=True
)
)
issue_activity.delay(
type="module.activity.deleted",
@@ -289,7 +321,6 @@ class ModuleIssueViewSet(WebhookMixin, BaseViewSet):
webhook_event = "module_issue"
bulk = True
filterset_fields = [
"issue__labels__id",
"issue__assignees__id",
@@ -304,7 +335,9 @@ class ModuleIssueViewSet(WebhookMixin, BaseViewSet):
super()
.get_queryset()
.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("issue"))
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("issue")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -324,18 +357,23 @@ class ModuleIssueViewSet(WebhookMixin, BaseViewSet):
@method_decorator(gzip_page)
def list(self, request, slug, project_id, module_id):
fields = [field for field in request.GET.get("fields", "").split(",") if field]
fields = [
field
for field in request.GET.get("fields", "").split(",")
if field
]
order_by = request.GET.get("order_by", "created_at")
filters = issue_filters(request.query_params, "GET")
issues = (
Issue.issue_objects.filter(issue_module__module_id=module_id)
.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(bridge_id=F("issue_module__id"))
.filter(project_id=project_id)
.filter(workspace__slug=slug)
.select_related("project")
@@ -346,6 +384,8 @@ class ModuleIssueViewSet(WebhookMixin, BaseViewSet):
.prefetch_related("labels")
.order_by(order_by)
.filter(**filters)
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(module_id=F("issue_module__module_id"))
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@@ -353,21 +393,32 @@ class ModuleIssueViewSet(WebhookMixin, BaseViewSet):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
is_subscribed=Exists(
IssueSubscriber.objects.filter(
subscriber=self.request.user, issue_id=OuterRef("id")
)
issues = IssueStateSerializer(issues, many=True, fields=fields if fields else None).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(issue_dict, status=status.HTTP_200_OK)
)
)
)
serializer = IssueSerializer(
issues, many=True, fields=fields if fields else None
)
return Response(serializer.data, status=status.HTTP_200_OK)
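The `is_subscribed` annotation above is a correlated `Exists` subquery: for each outer `Issue` row, the database checks whether a matching `IssueSubscriber` row exists. A self-contained sketch of the idiom, assuming the `Issue` and `IssueSubscriber` models used in this app:

from django.db.models import Exists, OuterRef

# One boolean column per issue: does the requesting user subscribe to it?
subscriptions = IssueSubscriber.objects.filter(
    subscriber=request.user, issue_id=OuterRef("id")
)
issues = Issue.objects.annotate(is_subscribed=Exists(subscriptions))

`OuterRef("id")` is resolved against the outer queryset, so the check runs inside a single SQL query rather than one query per row.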
def create(self, request, slug, project_id, module_id):
issues = request.data.get("issues", [])
if not len(issues):
return Response(
{"error": "Issues are required"}, status=status.HTTP_400_BAD_REQUEST
{"error": "Issues are required"},
status=status.HTTP_400_BAD_REQUEST,
)
module = Module.objects.get(
workspace__slug=slug, project_id=project_id, pk=module_id
@@ -439,25 +490,32 @@ class ModuleIssueViewSet(WebhookMixin, BaseViewSet):
epoch=int(timezone.now().timestamp()),
)
issues = self.get_queryset().values_list("issue_id", flat=True)
return Response(
ModuleIssueSerializer(self.get_queryset(), many=True).data,
IssueSerializer(
Issue.objects.filter(pk__in=issues), many=True
).data,
status=status.HTTP_200_OK,
)
def destroy(self, request, slug, project_id, module_id, pk):
def destroy(self, request, slug, project_id, module_id, issue_id):
module_issue = ModuleIssue.objects.get(
workspace__slug=slug, project_id=project_id, module_id=module_id, pk=pk
workspace__slug=slug,
project_id=project_id,
module_id=module_id,
issue_id=issue_id,
)
issue_activity.delay(
type="module.activity.deleted",
requested_data=json.dumps(
{
"module_id": str(module_id),
"issues": [str(module_issue.issue_id)],
"issues": [str(issue_id)],
}
),
actor_id=str(request.user.id),
issue_id=str(module_issue.issue_id),
issue_id=str(issue_id),
project_id=str(project_id),
current_instance=None,
epoch=int(timezone.now().timestamp()),
@@ -522,3 +580,41 @@ class ModuleFavoriteViewSet(BaseViewSet):
)
module_favorite.delete()
return Response(status=status.HTTP_204_NO_CONTENT)
class ModuleUserPropertiesEndpoint(BaseAPIView):
permission_classes = [
ProjectLitePermission,
]
def patch(self, request, slug, project_id, module_id):
module_properties = ModuleUserProperties.objects.get(
user=request.user,
module_id=module_id,
project_id=project_id,
workspace__slug=slug,
)
module_properties.filters = request.data.get(
"filters", module_properties.filters
)
module_properties.display_filters = request.data.get(
"display_filters", module_properties.display_filters
)
module_properties.display_properties = request.data.get(
"display_properties", module_properties.display_properties
)
module_properties.save()
serializer = ModuleUserPropertiesSerializer(module_properties)
return Response(serializer.data, status=status.HTTP_201_CREATED)
def get(self, request, slug, project_id, module_id):
module_properties, _ = ModuleUserProperties.objects.get_or_create(
user=request.user,
project_id=project_id,
module_id=module_id,
workspace__slug=slug,
)
serializer = ModuleUserPropertiesSerializer(module_properties)
return Response(serializer.data, status=status.HTTP_200_OK)
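The endpoint pairs `get_or_create` on GET (the first read creates a default row; later reads are idempotent) with `request.data.get(key, current_value)` on PATCH, so keys omitted from the payload keep their stored values. A condensed sketch of that update pattern; the helper name is hypothetical:

def apply_partial_update(instance, data, keys):
    # Overwrite only the keys present in `data`; keep stored values otherwise.
    for key in keys:
        setattr(instance, key, data.get(key, getattr(instance, key)))
    instance.save()
    return instance

# e.g. apply_partial_update(module_properties, request.data,
#          ["filters", "display_filters", "display_properties"])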


@@ -51,8 +51,10 @@ class NotificationViewSet(BaseViewSet, BasePaginator):
# Filters based on query parameters
snoozed_filters = {
"true": Q(snoozed_till__lt=timezone.now()) | Q(snoozed_till__isnull=False),
"false": Q(snoozed_till__gte=timezone.now()) | Q(snoozed_till__isnull=True),
"true": Q(snoozed_till__lt=timezone.now())
| Q(snoozed_till__isnull=False),
"false": Q(snoozed_till__gte=timezone.now())
| Q(snoozed_till__isnull=True),
}
notifications = notifications.filter(snoozed_filters[snoozed])
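The dict above maps each allowed query-param value to a `Q` expression, where `|` means SQL OR; indexing the dict with the parameter picks the branch. A small sketch of the same composition, assuming the same `snoozed_till` field:

from django.db.models import Q
from django.utils import timezone

now = timezone.now()
snoozed_q = Q(snoozed_till__lt=now) | Q(snoozed_till__isnull=False)
active_q = Q(snoozed_till__gte=now) | Q(snoozed_till__isnull=True)

# `snoozed` is the already-validated "true"/"false" query parameter.
notifications = notifications.filter(snoozed_q if snoozed == "true" else active_q)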
@@ -72,14 +74,18 @@ class NotificationViewSet(BaseViewSet, BasePaginator):
issue_ids = IssueSubscriber.objects.filter(
workspace__slug=slug, subscriber_id=request.user.id
).values_list("issue_id", flat=True)
notifications = notifications.filter(entity_identifier__in=issue_ids)
notifications = notifications.filter(
entity_identifier__in=issue_ids
)
# Assigned Issues
if type == "assigned":
issue_ids = IssueAssignee.objects.filter(
workspace__slug=slug, assignee_id=request.user.id
).values_list("issue_id", flat=True)
notifications = notifications.filter(entity_identifier__in=issue_ids)
notifications = notifications.filter(
entity_identifier__in=issue_ids
)
# Created issues
if type == "created":
@@ -94,10 +100,14 @@ class NotificationViewSet(BaseViewSet, BasePaginator):
issue_ids = Issue.objects.filter(
workspace__slug=slug, created_by=request.user
).values_list("pk", flat=True)
notifications = notifications.filter(entity_identifier__in=issue_ids)
notifications = notifications.filter(
entity_identifier__in=issue_ids
)
# Pagination
if request.GET.get("per_page", False) and request.GET.get("cursor", False):
if request.GET.get("per_page", False) and request.GET.get(
"cursor", False
):
return self.paginate(
request=request,
queryset=(notifications),
@@ -227,11 +237,13 @@ class MarkAllReadNotificationViewSet(BaseViewSet):
# Filter for snoozed notifications
if snoozed:
notifications = notifications.filter(
Q(snoozed_till__lt=timezone.now()) | Q(snoozed_till__isnull=False)
Q(snoozed_till__lt=timezone.now())
| Q(snoozed_till__isnull=False)
)
else:
notifications = notifications.filter(
Q(snoozed_till__gte=timezone.now()) | Q(snoozed_till__isnull=True),
Q(snoozed_till__gte=timezone.now())
| Q(snoozed_till__isnull=True),
)
# Filter for archived or unarchived notifications
@@ -245,14 +257,18 @@ class MarkAllReadNotificationViewSet(BaseViewSet):
issue_ids = IssueSubscriber.objects.filter(
workspace__slug=slug, subscriber_id=request.user.id
).values_list("issue_id", flat=True)
notifications = notifications.filter(entity_identifier__in=issue_ids)
notifications = notifications.filter(
entity_identifier__in=issue_ids
)
# Assigned Issues
if type == "assigned":
issue_ids = IssueAssignee.objects.filter(
workspace__slug=slug, assignee_id=request.user.id
).values_list("issue_id", flat=True)
notifications = notifications.filter(entity_identifier__in=issue_ids)
notifications = notifications.filter(
entity_identifier__in=issue_ids
)
# Created issues
if type == "created":
@@ -267,7 +283,9 @@ class MarkAllReadNotificationViewSet(BaseViewSet):
issue_ids = Issue.objects.filter(
workspace__slug=slug, created_by=request.user
).values_list("pk", flat=True)
notifications = notifications.filter(entity_identifier__in=issue_ids)
notifications = notifications.filter(
entity_identifier__in=issue_ids
)
updated_notifications = []
for notification in notifications:


@@ -1,5 +1,5 @@
# Python imports
from datetime import timedelta, date, datetime
from datetime import date, datetime, timedelta
# Django imports
from django.db import connection
@@ -7,30 +7,19 @@ from django.db.models import Exists, OuterRef, Q
from django.utils import timezone
from django.utils.decorators import method_decorator
from django.views.decorators.gzip import gzip_page
# Third party imports
from rest_framework import status
from rest_framework.response import Response
# Module imports
from .base import BaseViewSet, BaseAPIView
from plane.app.permissions import ProjectEntityPermission
from plane.db.models import (
Page,
PageFavorite,
Issue,
IssueAssignee,
IssueActivity,
PageLog,
ProjectMember,
)
from plane.app.serializers import (
PageSerializer,
PageFavoriteSerializer,
PageLogSerializer,
IssueLiteSerializer,
SubPageSerializer,
)
from plane.app.serializers import (IssueLiteSerializer, PageFavoriteSerializer,
PageLogSerializer, PageSerializer,
SubPageSerializer)
from plane.db.models import (Issue, IssueActivity, IssueAssignee, Page,
PageFavorite, PageLog, ProjectMember)
# Module imports
from .base import BaseAPIView, BaseViewSet
def unarchive_archive_page_and_descendants(page_id, archived_at):
@@ -97,7 +86,9 @@ class PageViewSet(BaseViewSet):
def partial_update(self, request, slug, project_id, pk):
try:
page = Page.objects.get(pk=pk, workspace__slug=slug, project_id=project_id)
page = Page.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id
)
if page.is_locked:
return Response(
@@ -127,7 +118,9 @@ class PageViewSet(BaseViewSet):
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
return Response(
serializer.errors, status=status.HTTP_400_BAD_REQUEST
)
except Page.DoesNotExist:
return Response(
{
@@ -157,17 +150,21 @@ class PageViewSet(BaseViewSet):
def list(self, request, slug, project_id):
queryset = self.get_queryset().filter(archived_at__isnull=True)
return Response(
PageSerializer(queryset, many=True).data, status=status.HTTP_200_OK
)
pages = PageSerializer(queryset, many=True).data
return Response(pages, status=status.HTTP_200_OK)
def archive(self, request, slug, project_id, page_id):
page = Page.objects.get(pk=page_id, workspace__slug=slug, project_id=project_id)
page = Page.objects.get(
pk=page_id, workspace__slug=slug, project_id=project_id
)
# only the owner and admin can archive the page
if (
ProjectMember.objects.filter(
project_id=project_id, member=request.user, is_active=True, role__gt=20
project_id=project_id,
member=request.user,
is_active=True,
role__gte=20,
).exists()
or request.user.id != page.owned_by_id
):
@@ -181,12 +178,17 @@ class PageViewSet(BaseViewSet):
return Response(status=status.HTTP_204_NO_CONTENT)
def unarchive(self, request, slug, project_id, page_id):
page = Page.objects.get(pk=page_id, workspace__slug=slug, project_id=project_id)
page = Page.objects.get(
pk=page_id, workspace__slug=slug, project_id=project_id
)
# only the owner and admin can unarchive the page
if (
ProjectMember.objects.filter(
project_id=project_id, member=request.user, is_active=True, role__gt=20
project_id=project_id,
member=request.user,
is_active=True,
role__gt=20,
).exists()
or request.user.id != page.owned_by_id
):
@@ -210,17 +212,21 @@ class PageViewSet(BaseViewSet):
workspace__slug=slug,
).filter(archived_at__isnull=False)
return Response(
PageSerializer(pages, many=True).data, status=status.HTTP_200_OK
)
pages = PageSerializer(pages, many=True).data
return Response(pages, status=status.HTTP_200_OK)
def destroy(self, request, slug, project_id, pk):
page = Page.objects.get(pk=pk, workspace__slug=slug, project_id=project_id)
page = Page.objects.get(
pk=pk, workspace__slug=slug, project_id=project_id
)
# only the owner and admin can delete the page
if (
ProjectMember.objects.filter(
project_id=project_id, member=request.user, is_active=True, role__gt=20
project_id=project_id,
member=request.user,
is_active=True,
role__gt=20,
).exists()
or request.user.id != page.owned_by_id
):


@@ -36,6 +36,7 @@ from plane.app.serializers import (
ProjectFavoriteSerializer,
ProjectDeployBoardSerializer,
ProjectMemberAdminSerializer,
ProjectMemberRoleSerializer,
)
from plane.app.permissions import (
@@ -85,9 +86,15 @@ class ProjectViewSet(WebhookMixin, BaseViewSet):
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(Q(project_projectmember__member=self.request.user) | Q(network=2))
.filter(
Q(project_projectmember__member=self.request.user)
| Q(network=2)
)
.select_related(
"workspace", "workspace__owner", "default_assignee", "project_lead"
"workspace",
"workspace__owner",
"default_assignee",
"project_lead",
)
.annotate(
is_favorite=Exists(
@@ -159,7 +166,11 @@ class ProjectViewSet(WebhookMixin, BaseViewSet):
)
def list(self, request, slug):
fields = [field for field in request.GET.get("fields", "").split(",") if field]
fields = [
field
for field in request.GET.get("fields", "").split(",")
if field
]
sort_order_query = ProjectMember.objects.filter(
member=request.user,
@@ -172,7 +183,9 @@ class ProjectViewSet(WebhookMixin, BaseViewSet):
.annotate(sort_order=Subquery(sort_order_query))
.order_by("sort_order", "name")
)
if request.GET.get("per_page", False) and request.GET.get("cursor", False):
if request.GET.get("per_page", False) and request.GET.get(
"cursor", False
):
return self.paginate(
request=request,
queryset=(projects),
@@ -180,12 +193,10 @@ class ProjectViewSet(WebhookMixin, BaseViewSet):
projects, many=True
).data,
)
return Response(
ProjectListSerializer(
projects = ProjectListSerializer(
projects, many=True, fields=fields if fields else None
).data
)
return Response(projects, status=status.HTTP_200_OK)
def create(self, request, slug):
try:
@@ -199,7 +210,9 @@ class ProjectViewSet(WebhookMixin, BaseViewSet):
# Add the user as Administrator to the project
project_member = ProjectMember.objects.create(
project_id=serializer.data["id"], member=request.user, role=20
project_id=serializer.data["id"],
member=request.user,
role=20,
)
# Also create the issue property for the user
_ = IssueProperty.objects.create(
@@ -272,9 +285,15 @@ class ProjectViewSet(WebhookMixin, BaseViewSet):
]
)
project = self.get_queryset().filter(pk=serializer.data["id"]).first()
project = (
self.get_queryset()
.filter(pk=serializer.data["id"])
.first()
)
serializer = ProjectListSerializer(project)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(
serializer.data, status=status.HTTP_201_CREATED
)
return Response(
serializer.errors,
status=status.HTTP_400_BAD_REQUEST,
@@ -287,7 +306,8 @@ class ProjectViewSet(WebhookMixin, BaseViewSet):
)
except Workspace.DoesNotExist as e:
return Response(
{"error": "Workspace does not exist"}, status=status.HTTP_404_NOT_FOUND
{"error": "Workspace does not exist"},
status=status.HTTP_404_NOT_FOUND,
)
except serializers.ValidationError as e:
return Response(
@@ -312,7 +332,9 @@ class ProjectViewSet(WebhookMixin, BaseViewSet):
serializer.save()
if serializer.data["inbox_view"]:
Inbox.objects.get_or_create(
name=f"{project.name} Inbox", project=project, is_default=True
name=f"{project.name} Inbox",
project=project,
is_default=True,
)
# Create the triage state in Backlog group
@@ -324,10 +346,16 @@ class ProjectViewSet(WebhookMixin, BaseViewSet):
color="#ff7700",
)
project = self.get_queryset().filter(pk=serializer.data["id"]).first()
project = (
self.get_queryset()
.filter(pk=serializer.data["id"])
.first()
)
serializer = ProjectListSerializer(project)
return Response(serializer.data, status=status.HTTP_200_OK)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
return Response(
serializer.errors, status=status.HTTP_400_BAD_REQUEST
)
except IntegrityError as e:
if "already exists" in str(e):
@@ -337,7 +365,8 @@ class ProjectViewSet(WebhookMixin, BaseViewSet):
)
except (Project.DoesNotExist, Workspace.DoesNotExist):
return Response(
{"error": "Project does not exist"}, status=status.HTTP_404_NOT_FOUND
{"error": "Project does not exist"},
status=status.HTTP_404_NOT_FOUND,
)
except serializers.ValidationError as e:
return Response(
@@ -372,11 +401,14 @@ class ProjectInvitationsViewset(BaseViewSet):
# Check if email is provided
if not emails:
return Response(
{"error": "Emails are required"}, status=status.HTTP_400_BAD_REQUEST
{"error": "Emails are required"},
status=status.HTTP_400_BAD_REQUEST,
)
requesting_user = ProjectMember.objects.get(
workspace__slug=slug, project_id=project_id, member_id=request.user.id
workspace__slug=slug,
project_id=project_id,
member_id=request.user.id,
)
# Check if any invited user has a higher role
@@ -550,7 +582,9 @@ class ProjectJoinEndpoint(BaseAPIView):
_ = WorkspaceMember.objects.create(
workspace_id=project_invite.workspace_id,
member=user,
role=15 if project_invite.role >= 15 else project_invite.role,
role=15
if project_invite.role >= 15
else project_invite.role,
)
else:
# Else mark the member as active
@@ -660,7 +694,8 @@ class ProjectMemberViewSet(BaseViewSet):
sort_order = [
project_member.get("sort_order")
for project_member in project_members
if str(project_member.get("member_id")) == str(member.get("member_id"))
if str(project_member.get("member_id"))
== str(member.get("member_id"))
]
bulk_project_members.append(
ProjectMember(
@@ -668,7 +703,9 @@ class ProjectMemberViewSet(BaseViewSet):
role=member.get("role", 10),
project_id=project_id,
workspace_id=project.workspace_id,
sort_order=sort_order[0] - 10000 if len(sort_order) else 65535,
sort_order=sort_order[0] - 10000
if len(sort_order)
else 65535,
)
)
bulk_issue_props.append(
@@ -713,13 +750,7 @@ class ProjectMemberViewSet(BaseViewSet):
return Response(serializer.data, status=status.HTTP_201_CREATED)
def list(self, request, slug, project_id):
project_member = ProjectMember.objects.get(
member=request.user,
workspace__slug=slug,
project_id=project_id,
is_active=True,
)
# Get the list of project members for the project
project_members = ProjectMember.objects.filter(
project_id=project_id,
workspace__slug=slug,
@@ -727,10 +758,9 @@ class ProjectMemberViewSet(BaseViewSet):
is_active=True,
).select_related("project", "member", "workspace")
if project_member.role > 10:
serializer = ProjectMemberAdminSerializer(project_members, many=True)
else:
serializer = ProjectMemberSerializer(project_members, many=True)
serializer = ProjectMemberRoleSerializer(
project_members, fields=("id", "member", "role"), many=True
)
return Response(serializer.data, status=status.HTTP_200_OK)
def partial_update(self, request, slug, project_id, pk):
@@ -758,7 +788,9 @@ class ProjectMemberViewSet(BaseViewSet):
> requested_project_member.role
):
return Response(
{"error": "You cannot update a role that is higher than your own role"},
{
"error": "You cannot update a role that is higher than your own role"
},
status=status.HTTP_400_BAD_REQUEST,
)
@@ -797,7 +829,9 @@ class ProjectMemberViewSet(BaseViewSet):
# Users cannot deactivate members with a higher role
if requesting_project_member.role < project_member.role:
return Response(
{"error": "You cannot remove a user having role higher than you"},
{
"error": "You cannot remove a user having role higher than you"
},
status=status.HTTP_400_BAD_REQUEST,
)
@@ -848,7 +882,8 @@ class AddTeamToProjectEndpoint(BaseAPIView):
if len(team_members) == 0:
return Response(
{"error": "No such team exists"}, status=status.HTTP_400_BAD_REQUEST
{"error": "No such team exists"},
status=status.HTTP_400_BAD_REQUEST,
)
workspace = Workspace.objects.get(slug=slug)
@@ -895,7 +930,8 @@ class ProjectIdentifierEndpoint(BaseAPIView):
if name == "":
return Response(
{"error": "Name is required"}, status=status.HTTP_400_BAD_REQUEST
{"error": "Name is required"},
status=status.HTTP_400_BAD_REQUEST,
)
exists = ProjectIdentifier.objects.filter(
@@ -912,16 +948,23 @@ class ProjectIdentifierEndpoint(BaseAPIView):
if name == "":
return Response(
{"error": "Name is required"}, status=status.HTTP_400_BAD_REQUEST
)
if Project.objects.filter(identifier=name, workspace__slug=slug).exists():
return Response(
{"error": "Cannot delete an identifier of an existing project"},
{"error": "Name is required"},
status=status.HTTP_400_BAD_REQUEST,
)
ProjectIdentifier.objects.filter(name=name, workspace__slug=slug).delete()
if Project.objects.filter(
identifier=name, workspace__slug=slug
).exists():
return Response(
{
"error": "Cannot delete an identifier of an existing project"
},
status=status.HTTP_400_BAD_REQUEST,
)
ProjectIdentifier.objects.filter(
name=name, workspace__slug=slug
).delete()
return Response(
status=status.HTTP_204_NO_CONTENT,
@@ -939,7 +982,9 @@ class ProjectUserViewsEndpoint(BaseAPIView):
).first()
if project_member is None:
return Response({"error": "Forbidden"}, status=status.HTTP_403_FORBIDDEN)
return Response(
{"error": "Forbidden"}, status=status.HTTP_403_FORBIDDEN
)
view_props = project_member.view_props
default_props = project_member.default_props
@@ -947,8 +992,12 @@ class ProjectUserViewsEndpoint(BaseAPIView):
sort_order = project_member.sort_order
project_member.view_props = request.data.get("view_props", view_props)
project_member.default_props = request.data.get("default_props", default_props)
project_member.preferences = request.data.get("preferences", preferences)
project_member.default_props = request.data.get(
"default_props", default_props
)
project_member.preferences = request.data.get(
"preferences", preferences
)
project_member.sort_order = request.data.get("sort_order", sort_order)
project_member.save()
@@ -1010,18 +1059,11 @@ class ProjectPublicCoverImagesEndpoint(BaseAPIView):
def get(self, request):
files = []
s3_client_params = {
"service_name": "s3",
"aws_access_key_id": settings.AWS_ACCESS_KEY_ID,
"aws_secret_access_key": settings.AWS_SECRET_ACCESS_KEY,
}
# Use AWS_S3_ENDPOINT_URL if it is present in the settings
if hasattr(settings, "AWS_S3_ENDPOINT_URL") and settings.AWS_S3_ENDPOINT_URL:
s3_client_params["endpoint_url"] = settings.AWS_S3_ENDPOINT_URL
s3 = boto3.client(**s3_client_params)
s3 = boto3.client(
"s3",
aws_access_key_id=settings.AWS_ACCESS_KEY_ID,
aws_secret_access_key=settings.AWS_SECRET_ACCESS_KEY,
)
params = {
"Bucket": settings.AWS_STORAGE_BUCKET_NAME,
"Prefix": "static/project-cover/",
@@ -1034,16 +1076,6 @@ class ProjectPublicCoverImagesEndpoint(BaseAPIView):
if not content["Key"].endswith(
"/"
): # This line ensures we're only getting files, not "sub-folders"
if (
hasattr(settings, "AWS_S3_CUSTOM_DOMAIN")
and settings.AWS_S3_CUSTOM_DOMAIN
and hasattr(settings, "AWS_S3_URL_PROTOCOL")
and settings.AWS_S3_URL_PROTOCOL
):
files.append(
f"{settings.AWS_S3_URL_PROTOCOL}//{settings.AWS_S3_CUSTOM_DOMAIN}/{content['Key']}"
)
else:
files.append(
f"https://{settings.AWS_STORAGE_BUCKET_NAME}.s3.{settings.AWS_REGION}.amazonaws.com/{content['Key']}"
)
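The deleted branch built object URLs from `AWS_S3_CUSTOM_DOMAIN`; what remains always uses the bucket's regional URL. For reference, a standalone sketch of listing keys under a prefix with boto3 (bucket, region, and prefix values are placeholders):

import boto3

s3 = boto3.client("s3")
response = s3.list_objects_v2(Bucket="my-bucket", Prefix="static/project-cover/")
keys = [
    obj["Key"]
    for obj in response.get("Contents", [])
    if not obj["Key"].endswith("/")  # skip zero-byte "folder" placeholder objects
]
urls = [f"https://my-bucket.s3.us-east-1.amazonaws.com/{key}" for key in keys]

Note that `list_objects_v2` returns at most 1,000 keys per call; a paginator would be needed for larger prefixes.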
@@ -1113,6 +1145,7 @@ class UserProjectRolesEndpoint(BaseAPIView):
).values("project_id", "role")
project_members = {
str(member["project_id"]): member["role"] for member in project_members
str(member["project_id"]): member["role"]
for member in project_members
}
return Response(project_members, status=status.HTTP_200_OK)


@@ -10,7 +10,15 @@ from rest_framework.response import Response
# Module imports
from .base import BaseAPIView
from plane.db.models import Workspace, Project, Issue, Cycle, Module, Page, IssueView
from plane.db.models import (
Workspace,
Project,
Issue,
Cycle,
Module,
Page,
IssueView,
)
from plane.utils.issue_search import search_issues
@@ -25,7 +33,9 @@ class GlobalSearchEndpoint(BaseAPIView):
for field in fields:
q |= Q(**{f"{field}__icontains": query})
return (
Workspace.objects.filter(q, workspace_member__member=self.request.user)
Workspace.objects.filter(
q, workspace_member__member=self.request.user
)
.distinct()
.values("name", "id", "slug")
)
@@ -38,7 +48,8 @@ class GlobalSearchEndpoint(BaseAPIView):
return (
Project.objects.filter(
q,
Q(project_projectmember__member=self.request.user) | Q(network=2),
Q(project_projectmember__member=self.request.user)
| Q(network=2),
workspace__slug=slug,
)
.distinct()
@@ -169,7 +180,9 @@ class GlobalSearchEndpoint(BaseAPIView):
def get(self, request, slug):
query = request.query_params.get("search", False)
workspace_search = request.query_params.get("workspace_search", "false")
workspace_search = request.query_params.get(
"workspace_search", "false"
)
project_id = request.query_params.get("project_id", False)
if not query:
@@ -209,7 +222,9 @@ class GlobalSearchEndpoint(BaseAPIView):
class IssueSearchEndpoint(BaseAPIView):
def get(self, request, slug, project_id):
query = request.query_params.get("search", False)
workspace_search = request.query_params.get("workspace_search", "false")
workspace_search = request.query_params.get(
"workspace_search", "false"
)
parent = request.query_params.get("parent", "false")
issue_relation = request.query_params.get("issue_relation", "false")
cycle = request.query_params.get("cycle", "false")
@@ -234,9 +249,9 @@ class IssueSearchEndpoint(BaseAPIView):
issues = issues.filter(
~Q(pk=issue_id), ~Q(pk=issue.parent_id), parent__isnull=True
).exclude(
pk__in=Issue.issue_objects.filter(parent__isnull=False).values_list(
"parent_id", flat=True
)
pk__in=Issue.issue_objects.filter(
parent__isnull=False
).values_list("parent_id", flat=True)
)
if issue_relation == "true" and issue_id:
issue = Issue.issue_objects.get(pk=issue_id)


@@ -77,14 +77,19 @@ class StateViewSet(BaseViewSet):
)
if state.default:
return Response({"error": "Default state cannot be deleted"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{"error": "Default state cannot be deleted"},
status=status.HTTP_400_BAD_REQUEST,
)
# Check for any issues in the state
issue_exist = Issue.issue_objects.filter(state=pk).exists()
if issue_exist:
return Response(
{"error": "The state is not empty, only empty states can be deleted"},
{
"error": "The state is not empty, only empty states can be deleted"
},
status=status.HTTP_400_BAD_REQUEST,
)


@@ -43,7 +43,9 @@ class UserEndpoint(BaseViewSet):
is_admin = InstanceAdmin.objects.filter(
instance=instance, user=request.user
).exists()
return Response({"is_instance_admin": is_admin}, status=status.HTTP_200_OK)
return Response(
{"is_instance_admin": is_admin}, status=status.HTTP_200_OK
)
def deactivate(self, request):
# Check all workspaces where the user is active
@@ -51,7 +53,12 @@ class UserEndpoint(BaseViewSet):
# Instance admin check
if InstanceAdmin.objects.filter(user=user).exists():
return Response({"error": "You cannot deactivate your account since you are an instance admin"}, status=status.HTTP_400_BAD_REQUEST)
return Response(
{
"error": "You cannot deactivate your account since you are an instance admin"
},
status=status.HTTP_400_BAD_REQUEST,
)
projects_to_deactivate = []
workspaces_to_deactivate = []
@@ -61,7 +68,10 @@ class UserEndpoint(BaseViewSet):
).annotate(
other_admin_exists=Count(
Case(
When(Q(role=20, is_active=True) & ~Q(member=request.user), then=1),
When(
Q(role=20, is_active=True) & ~Q(member=request.user),
then=1,
),
default=0,
output_field=IntegerField(),
)
@@ -86,7 +96,10 @@ class UserEndpoint(BaseViewSet):
).annotate(
other_admin_exists=Count(
Case(
When(Q(role=20, is_active=True) & ~Q(member=request.user), then=1),
When(
Q(role=20, is_active=True) & ~Q(member=request.user),
then=1,
),
default=0,
output_field=IntegerField(),
)
@@ -95,7 +108,9 @@ class UserEndpoint(BaseViewSet):
)
for workspace in workspaces:
if workspace.other_admin_exists > 0 or (workspace.total_members == 1):
if workspace.other_admin_exists > 0 or (
workspace.total_members == 1
):
workspace.is_active = False
workspaces_to_deactivate.append(workspace)
else:
@@ -134,7 +149,9 @@ class UpdateUserOnBoardedEndpoint(BaseAPIView):
user = User.objects.get(pk=request.user.id, is_active=True)
user.is_onboarded = request.data.get("is_onboarded", False)
user.save()
return Response({"message": "Updated successfully"}, status=status.HTTP_200_OK)
return Response(
{"message": "Updated successfully"}, status=status.HTTP_200_OK
)
class UpdateUserTourCompletedEndpoint(BaseAPIView):
@@ -142,14 +159,16 @@ class UpdateUserTourCompletedEndpoint(BaseAPIView):
user = User.objects.get(pk=request.user.id, is_active=True)
user.is_tour_completed = request.data.get("is_tour_completed", False)
user.save()
return Response({"message": "Updated successfully"}, status=status.HTTP_200_OK)
return Response(
{"message": "Updated successfully"}, status=status.HTTP_200_OK
)
class UserActivityEndpoint(BaseAPIView, BasePaginator):
def get(self, request):
queryset = IssueActivity.objects.filter(actor=request.user).select_related(
"actor", "workspace", "issue", "project"
)
queryset = IssueActivity.objects.filter(
actor=request.user
).select_related("actor", "workspace", "issue", "project")
return self.paginate(
request=request,
@@ -158,4 +177,3 @@ class UserActivityEndpoint(BaseAPIView, BasePaginator):
issue_activities, many=True
).data,
)


@@ -24,10 +24,15 @@ from . import BaseViewSet, BaseAPIView
from plane.app.serializers import (
GlobalViewSerializer,
IssueViewSerializer,
IssueLiteSerializer,
IssueSerializer,
IssueViewFavoriteSerializer,
)
from plane.app.permissions import WorkspaceEntityPermission, ProjectEntityPermission
from plane.app.permissions import (
WorkspaceEntityPermission,
ProjectEntityPermission,
WorkspaceViewerPermission,
ProjectLitePermission,
)
from plane.db.models import (
Workspace,
GlobalView,
@@ -37,14 +42,15 @@ from plane.db.models import (
IssueReaction,
IssueLink,
IssueAttachment,
IssueSubscriber,
)
from plane.utils.issue_filters import issue_filters
from plane.utils.grouper import group_results
class GlobalViewViewSet(BaseViewSet):
serializer_class = GlobalViewSerializer
model = GlobalView
serializer_class = IssueViewSerializer
model = IssueView
permission_classes = [
WorkspaceEntityPermission,
]
@@ -58,6 +64,7 @@ class GlobalViewViewSet(BaseViewSet):
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
.filter(project__isnull=True)
.select_related("workspace")
.order_by(self.request.GET.get("order_by", "-created_at"))
.distinct()
@@ -72,7 +79,9 @@ class GlobalViewIssuesViewSet(BaseViewSet):
def get_queryset(self):
return (
Issue.issue_objects.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
@@ -95,11 +104,21 @@ class GlobalViewIssuesViewSet(BaseViewSet):
@method_decorator(gzip_page)
def list(self, request, slug):
filters = issue_filters(request.query_params, "GET")
fields = [field for field in request.GET.get("fields", "").split(",") if field]
fields = [
field
for field in request.GET.get("fields", "").split(",")
if field
]
# Custom ordering for priority and state
priority_order = ["urgent", "high", "medium", "low", "none"]
state_order = ["backlog", "unstarted", "started", "completed", "cancelled"]
state_order = [
"backlog",
"unstarted",
"started",
"completed",
"cancelled",
]
order_by_param = request.GET.get("order_by", "-created_at")
@@ -116,17 +135,36 @@ class GlobalViewIssuesViewSet(BaseViewSet):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
is_subscribed=Exists(
IssueSubscriber.objects.filter(
subscriber=self.request.user, issue_id=OuterRef("id")
)
)
)
)
# Priority Ordering
if order_by_param == "priority" or order_by_param == "-priority":
priority_order = (
priority_order if order_by_param == "priority" else priority_order[::-1]
priority_order
if order_by_param == "priority"
else priority_order[::-1]
)
issue_queryset = issue_queryset.annotate(
priority_order=Case(
@@ -174,17 +212,17 @@ class GlobalViewIssuesViewSet(BaseViewSet):
else order_by_param
)
).order_by(
"-max_values" if order_by_param.startswith("-") else "max_values"
"-max_values"
if order_by_param.startswith("-")
else "max_values"
)
else:
issue_queryset = issue_queryset.order_by(order_by_param)
issues = IssueLiteSerializer(issue_queryset, many=True, fields=fields if fields else None).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(
issue_dict,
status=status.HTTP_200_OK,
serializer = IssueSerializer(
issue_queryset, many=True, fields=fields if fields else None
)
return Response(serializer.data, status=status.HTTP_200_OK)
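Ordering by a fixed priority list works by annotating each row with its index in that list via `Case`/`When`, then ordering by the annotation. A minimal sketch with the same five priority values:

from django.db.models import Case, IntegerField, Value, When

priority_order = ["urgent", "high", "medium", "low", "none"]
whens = [When(priority=p, then=Value(i)) for i, p in enumerate(priority_order)]
issues = issue_queryset.annotate(
    priority_rank=Case(*whens, output_field=IntegerField())
).order_by("priority_rank")

Reversing `priority_order` before building the `When` clauses, as the view does for `-priority`, flips the sort without a second annotation.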
class IssueViewViewSet(BaseViewSet):
@@ -217,6 +255,18 @@ class IssueViewViewSet(BaseViewSet):
.distinct()
)
def list(self, request, slug, project_id):
queryset = self.get_queryset()
fields = [
field
for field in request.GET.get("fields", "").split(",")
if field
]
views = IssueViewSerializer(
queryset, many=True, fields=fields if fields else None
).data
return Response(views, status=status.HTTP_200_OK)
class IssueViewFavoriteViewSet(BaseViewSet):
serializer_class = IssueViewFavoriteSerializer


@@ -26,8 +26,12 @@ class WebhookEndpoint(BaseAPIView):
)
if serializer.is_valid():
serializer.save(workspace_id=workspace.id)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)
return Response(
serializer.data, status=status.HTTP_201_CREATED
)
return Response(
serializer.errors, status=status.HTTP_400_BAD_REQUEST
)
except IntegrityError as e:
if "already exists" in str(e):
return Response(


@@ -41,9 +41,11 @@ from plane.app.serializers import (
ProjectMemberSerializer,
WorkspaceThemeSerializer,
IssueActivitySerializer,
IssueLiteSerializer,
IssueSerializer,
WorkspaceMemberAdminSerializer,
WorkspaceMemberMeSerializer,
ProjectMemberRoleSerializer,
WorkspaceUserPropertiesSerializer,
)
from plane.app.views.base import BaseAPIView
from . import BaseViewSet
@@ -64,6 +66,7 @@ from plane.db.models import (
WorkspaceMember,
CycleIssue,
IssueReaction,
WorkspaceUserProperties,
)
from plane.app.permissions import (
WorkSpaceBasePermission,
@@ -71,11 +74,13 @@ from plane.app.permissions import (
WorkspaceEntityPermission,
WorkspaceViewerPermission,
WorkspaceUserPermission,
ProjectLitePermission,
)
from plane.bgtasks.workspace_invitation_task import workspace_invitation
from plane.utils.issue_filters import issue_filters
from plane.bgtasks.event_tracking_task import workspace_invite_event
class WorkSpaceViewSet(BaseViewSet):
model = Workspace
serializer_class = WorkSpaceSerializer
@@ -111,7 +116,9 @@ class WorkSpaceViewSet(BaseViewSet):
.values("count")
)
return (
self.filter_queryset(super().get_queryset().select_related("owner"))
self.filter_queryset(
super().get_queryset().select_related("owner")
)
.order_by("name")
.filter(
workspace_member__member=self.request.user,
@@ -137,7 +144,9 @@ class WorkSpaceViewSet(BaseViewSet):
if len(name) > 80 or len(slug) > 48:
return Response(
{"error": "The maximum length for name is 80 and for slug is 48"},
{
"error": "The maximum length for name is 80 and for slug is 48"
},
status=status.HTTP_400_BAD_REQUEST,
)
@@ -150,7 +159,9 @@ class WorkSpaceViewSet(BaseViewSet):
role=20,
company_role=request.data.get("company_role", ""),
)
return Response(serializer.data, status=status.HTTP_201_CREATED)
return Response(
serializer.data, status=status.HTTP_201_CREATED
)
return Response(
[serializer.errors[error][0] for error in serializer.errors],
status=status.HTTP_400_BAD_REQUEST,
@@ -173,6 +184,11 @@ class UserWorkSpacesEndpoint(BaseAPIView):
]
def get(self, request):
fields = [
field
for field in request.GET.get("fields", "").split(",")
if field
]
member_count = (
WorkspaceMember.objects.filter(
workspace=OuterRef("id"),
@@ -204,13 +220,17 @@ class UserWorkSpacesEndpoint(BaseAPIView):
.annotate(total_members=member_count)
.annotate(total_issues=issue_count)
.filter(
workspace_member__member=request.user, workspace_member__is_active=True
workspace_member__member=request.user,
workspace_member__is_active=True,
)
.distinct()
)
serializer = WorkSpaceSerializer(self.filter_queryset(workspace), many=True)
return Response(serializer.data, status=status.HTTP_200_OK)
workspaces = WorkSpaceSerializer(
self.filter_queryset(workspace),
fields=fields if fields else None,
many=True,
).data
return Response(workspaces, status=status.HTTP_200_OK)
class WorkSpaceAvailabilityCheckEndpoint(BaseAPIView):
@@ -250,7 +270,8 @@ class WorkspaceInvitationsViewset(BaseViewSet):
# Check if email is provided
if not emails:
return Response(
{"error": "Emails are required"}, status=status.HTTP_400_BAD_REQUEST
{"error": "Emails are required"},
status=status.HTTP_400_BAD_REQUEST,
)
# Check the role level of the requesting user
@@ -537,10 +558,15 @@ class WorkSpaceMemberViewSet(BaseViewSet):
workspace_members = self.get_queryset()
if workspace_member.role > 10:
serializer = WorkspaceMemberAdminSerializer(workspace_members, many=True)
serializer = WorkspaceMemberAdminSerializer(
workspace_members,
fields=("id", "member", "role"),
many=True,
)
else:
serializer = WorkSpaceMemberSerializer(
workspace_members,
fields=("id", "member", "role"),
many=True,
)
return Response(serializer.data, status=status.HTTP_200_OK)
@@ -572,7 +598,9 @@ class WorkSpaceMemberViewSet(BaseViewSet):
> requested_workspace_member.role
):
return Response(
{"error": "You cannot update a role that is higher than your own role"},
{
"error": "You cannot update a role that is higher than your own role"
},
status=status.HTTP_400_BAD_REQUEST,
)
@@ -611,7 +639,9 @@ class WorkSpaceMemberViewSet(BaseViewSet):
if requesting_workspace_member.role < workspace_member.role:
return Response(
{"error": "You cannot remove a user having role higher than you"},
{
"error": "You cannot remove a user having role higher than you"
},
status=status.HTTP_400_BAD_REQUEST,
)
@@ -705,6 +735,49 @@ class WorkSpaceMemberViewSet(BaseViewSet):
return Response(status=status.HTTP_204_NO_CONTENT)
class WorkspaceProjectMemberEndpoint(BaseAPIView):
serializer_class = ProjectMemberRoleSerializer
model = ProjectMember
permission_classes = [
WorkspaceEntityPermission,
]
def get(self, request, slug):
# Fetch all project IDs where the user is involved
project_ids = (
ProjectMember.objects.filter(
member=request.user,
member__is_bot=False,
is_active=True,
)
.values_list("project_id", flat=True)
.distinct()
)
# Get all project members for the projects the user is involved in
project_members = ProjectMember.objects.filter(
workspace__slug=slug,
member__is_bot=False,
project_id__in=project_ids,
is_active=True,
).select_related("project", "member", "workspace")
project_members = ProjectMemberRoleSerializer(
project_members, many=True
).data
project_members_dict = dict()
# Construct a dictionary with project_id as key and project_members as value
for project_member in project_members:
project_id = project_member.pop("project")
if str(project_id) not in project_members_dict:
project_members_dict[str(project_id)] = []
project_members_dict[str(project_id)].append(project_member)
return Response(project_members_dict, status=status.HTTP_200_OK)
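The loop above groups serialized members by project id with a manual membership check; `collections.defaultdict` expresses the same grouping more compactly. A hypothetical rewrite, not the committed code:

from collections import defaultdict

project_members_dict = defaultdict(list)
for project_member in project_members:
    project_id = project_member.pop("project")
    project_members_dict[str(project_id)].append(project_member)
# dict(project_members_dict) if the response layer expects a plain dict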
class TeamMemberViewSet(BaseViewSet):
serializer_class = TeamSerializer
model = Team
@@ -739,7 +812,9 @@ class TeamMemberViewSet(BaseViewSet):
)
if len(members) != len(request.data.get("members", [])):
users = list(set(request.data.get("members", [])).difference(members))
users = list(
set(request.data.get("members", [])).difference(members)
)
users = User.objects.filter(pk__in=users)
serializer = UserLiteSerializer(users, many=True)
@@ -753,7 +828,9 @@ class TeamMemberViewSet(BaseViewSet):
workspace = Workspace.objects.get(slug=slug)
serializer = TeamSerializer(data=request.data, context={"workspace": workspace})
serializer = TeamSerializer(
data=request.data, context={"workspace": workspace}
)
if serializer.is_valid():
serializer.save()
return Response(serializer.data, status=status.HTTP_201_CREATED)
@@ -782,7 +859,9 @@ class UserLastProjectWithWorkspaceEndpoint(BaseAPIView):
workspace_id=last_workspace_id, member=request.user
).select_related("workspace", "project", "member", "workspace__owner")
project_member_serializer = ProjectMemberSerializer(project_member, many=True)
project_member_serializer = ProjectMemberSerializer(
project_member, many=True
)
return Response(
{
@@ -966,7 +1045,11 @@ class WorkspaceThemeViewSet(BaseViewSet):
serializer_class = WorkspaceThemeSerializer
def get_queryset(self):
return super().get_queryset().filter(workspace__slug=self.kwargs.get("slug"))
return (
super()
.get_queryset()
.filter(workspace__slug=self.kwargs.get("slug"))
)
def create(self, request, slug):
workspace = Workspace.objects.get(slug=slug)
@@ -1229,12 +1312,22 @@ class WorkspaceUserProfileIssuesEndpoint(BaseAPIView):
]
def get(self, request, slug, user_id):
fields = [field for field in request.GET.get("fields", "").split(",") if field]
fields = [
field
for field in request.GET.get("fields", "").split(",")
if field
]
filters = issue_filters(request.query_params, "GET")
# Custom ordering for priority and state
priority_order = ["urgent", "high", "medium", "low", "none"]
state_order = ["backlog", "unstarted", "started", "completed", "cancelled"]
state_order = [
"backlog",
"unstarted",
"started",
"completed",
"cancelled",
]
order_by_param = request.GET.get("order_by", "-created_at")
issue_queryset = (
@@ -1246,21 +1339,10 @@ class WorkspaceUserProfileIssuesEndpoint(BaseAPIView):
project__project_projectmember__member=request.user,
)
.filter(**filters)
.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.select_related("project", "workspace", "state", "parent")
.select_related("workspace", "project", "state", "parent")
.prefetch_related("assignees", "labels")
.prefetch_related(
Prefetch(
"issue_reactions",
queryset=IssueReaction.objects.select_related("actor"),
)
)
.order_by("-created_at")
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(module_id=F("issue_module__module_id"))
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
@@ -1268,17 +1350,28 @@ class WorkspaceUserProfileIssuesEndpoint(BaseAPIView):
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(issue=OuterRef("id"))
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
sub_issues_count=Issue.issue_objects.filter(parent=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.order_by("created_at")
).distinct()
# Priority Ordering
if order_by_param == "priority" or order_by_param == "-priority":
priority_order = (
priority_order if order_by_param == "priority" else priority_order[::-1]
priority_order
if order_by_param == "priority"
else priority_order[::-1]
)
issue_queryset = issue_queryset.annotate(
priority_order=Case(
@@ -1326,16 +1419,17 @@ class WorkspaceUserProfileIssuesEndpoint(BaseAPIView):
else order_by_param
)
).order_by(
"-max_values" if order_by_param.startswith("-") else "max_values"
"-max_values"
if order_by_param.startswith("-")
else "max_values"
)
else:
issue_queryset = issue_queryset.order_by(order_by_param)
issues = IssueLiteSerializer(
issues = IssueSerializer(
issue_queryset, many=True, fields=fields if fields else None
).data
issue_dict = {str(issue["id"]): issue for issue in issues}
return Response(issue_dict, status=status.HTTP_200_OK)
return Response(issues, status=status.HTTP_200_OK)
class WorkspaceLabelsEndpoint(BaseAPIView):
@@ -1347,5 +1441,43 @@ class WorkspaceLabelsEndpoint(BaseAPIView):
labels = Label.objects.filter(
workspace__slug=slug,
project__project_projectmember__member=request.user,
).values("parent", "name", "color", "id", "project_id", "workspace__slug")
).values(
"parent", "name", "color", "id", "project_id", "workspace__slug"
)
return Response(labels, status=status.HTTP_200_OK)
class WorkspaceUserPropertiesEndpoint(BaseAPIView):
permission_classes = [
WorkspaceViewerPermission,
]
def patch(self, request, slug):
workspace_properties = WorkspaceUserProperties.objects.get(
user=request.user,
workspace__slug=slug,
)
workspace_properties.filters = request.data.get(
"filters", workspace_properties.filters
)
workspace_properties.display_filters = request.data.get(
"display_filters", workspace_properties.display_filters
)
workspace_properties.display_properties = request.data.get(
"display_properties", workspace_properties.display_properties
)
workspace_properties.save()
serializer = WorkspaceUserPropertiesSerializer(workspace_properties)
return Response(serializer.data, status=status.HTTP_201_CREATED)
def get(self, request, slug):
(
workspace_properties,
_,
) = WorkspaceUserProperties.objects.get_or_create(
user=request.user, workspace__slug=slug
)
serializer = WorkspaceUserPropertiesSerializer(workspace_properties)
return Response(serializer.data, status=status.HTTP_200_OK)


@@ -101,7 +101,9 @@ def get_assignee_details(slug, filters):
def get_label_details(slug, filters):
"""Fetch label details if required"""
return (
Issue.objects.filter(workspace__slug=slug, **filters, labels__id__isnull=False)
Issue.objects.filter(
workspace__slug=slug, **filters, labels__id__isnull=False
)
.distinct("labels__id")
.order_by("labels__id")
.values("labels__id", "labels__color", "labels__name")
@@ -174,7 +176,9 @@ def generate_segmented_rows(
):
segment_zero = list(
set(
item.get("segment") for sublist in distribution.values() for item in sublist
item.get("segment")
for sublist in distribution.values()
for item in sublist
)
)
@@ -193,7 +197,9 @@ def generate_segmented_rows(
]
for segment in segment_zero:
value = next((x.get(key) for x in data if x.get("segment") == segment), "0")
value = next(
(x.get(key) for x in data if x.get("segment") == segment), "0"
)
generated_row.append(value)
if x_axis == ASSIGNEE_ID:
@@ -212,7 +218,11 @@ def generate_segmented_rows(
if x_axis == LABEL_ID:
label = next(
(lab for lab in label_details if str(lab[LABEL_ID]) == str(item)),
(
lab
for lab in label_details
if str(lab[LABEL_ID]) == str(item)
),
None,
)
@@ -221,7 +231,11 @@ def generate_segmented_rows(
if x_axis == STATE_ID:
state = next(
(sta for sta in state_details if str(sta[STATE_ID]) == str(item)),
(
sta
for sta in state_details
if str(sta[STATE_ID]) == str(item)
),
None,
)
@@ -230,7 +244,11 @@ def generate_segmented_rows(
if x_axis == CYCLE_ID:
cycle = next(
(cyc for cyc in cycle_details if str(cyc[CYCLE_ID]) == str(item)),
(
cyc
for cyc in cycle_details
if str(cyc[CYCLE_ID]) == str(item)
),
None,
)
@@ -239,7 +257,11 @@ def generate_segmented_rows(
if x_axis == MODULE_ID:
module = next(
(mod for mod in module_details if str(mod[MODULE_ID]) == str(item)),
(
mod
for mod in module_details
if str(mod[MODULE_ID]) == str(item)
),
None,
)
@@ -266,7 +288,11 @@ def generate_segmented_rows(
if segmented == LABEL_ID:
for index, segm in enumerate(row_zero[2:]):
label = next(
(lab for lab in label_details if str(lab[LABEL_ID]) == str(segm)),
(
lab
for lab in label_details
if str(lab[LABEL_ID]) == str(segm)
),
None,
)
if label:
@@ -275,7 +301,11 @@ def generate_segmented_rows(
if segmented == STATE_ID:
for index, segm in enumerate(row_zero[2:]):
state = next(
(sta for sta in state_details if str(sta[STATE_ID]) == str(segm)),
(
sta
for sta in state_details
if str(sta[STATE_ID]) == str(segm)
),
None,
)
if state:
@@ -284,7 +314,11 @@ def generate_segmented_rows(
if segmented == MODULE_ID:
for index, segm in enumerate(row_zero[2:]):
module = next(
(mod for mod in label_details if str(mod[MODULE_ID]) == str(segm)),
(
mod
for mod in label_details
if str(mod[MODULE_ID]) == str(segm)
),
None,
)
if module:
@@ -293,7 +327,11 @@ def generate_segmented_rows(
if segmented == CYCLE_ID:
for index, segm in enumerate(row_zero[2:]):
cycle = next(
(cyc for cyc in cycle_details if str(cyc[CYCLE_ID]) == str(segm)),
(
cyc
for cyc in cycle_details
if str(cyc[CYCLE_ID]) == str(segm)
),
None,
)
if cycle:
@@ -315,7 +353,10 @@ def generate_non_segmented_rows(
):
rows = []
for item, data in distribution.items():
row = [item, data[0].get("count" if y_axis == "issue_count" else "estimate")]
row = [
item,
data[0].get("count" if y_axis == "issue_count" else "estimate"),
]
if x_axis == ASSIGNEE_ID:
assignee = next(
@@ -333,7 +374,11 @@ def generate_non_segmented_rows(
if x_axis == LABEL_ID:
label = next(
(lab for lab in label_details if str(lab[LABEL_ID]) == str(item)),
(
lab
for lab in label_details
if str(lab[LABEL_ID]) == str(item)
),
None,
)
@@ -342,7 +387,11 @@ def generate_non_segmented_rows(
if x_axis == STATE_ID:
state = next(
(sta for sta in state_details if str(sta[STATE_ID]) == str(item)),
(
sta
for sta in state_details
if str(sta[STATE_ID]) == str(item)
),
None,
)
@@ -351,7 +400,11 @@ def generate_non_segmented_rows(
if x_axis == CYCLE_ID:
cycle = next(
(cyc for cyc in cycle_details if str(cyc[CYCLE_ID]) == str(item)),
(
cyc
for cyc in cycle_details
if str(cyc[CYCLE_ID]) == str(item)
),
None,
)
@@ -360,7 +413,11 @@ def generate_non_segmented_rows(
if x_axis == MODULE_ID:
module = next(
(mod for mod in module_details if str(mod[MODULE_ID]) == str(item)),
(
mod
for mod in module_details
if str(mod[MODULE_ID]) == str(item)
),
None,
)
@@ -369,7 +426,10 @@ def generate_non_segmented_rows(
rows.append(tuple(row))
row_zero = [row_mapping.get(x_axis, "X-Axis"), row_mapping.get(y_axis, "Y-Axis")]
row_zero = [
row_mapping.get(x_axis, "X-Axis"),
row_mapping.get(y_axis, "Y-Axis"),
]
return [tuple(row_zero)] + rows
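Every detail lookup in this module uses the same idiom: `next()` over a generator expression with a `None` default, i.e., "first match or None". A tiny sketch of the pattern factored into a helper (the helper name is illustrative):

def first_match(rows, key, value):
    # Return the first dict in `rows` whose `key` equals `value`, compared as strings.
    return next(
        (row for row in rows if str(row.get(key)) == str(value)),
        None,
    )

# e.g. label = first_match(label_details, LABEL_ID, item)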


@@ -2,4 +2,4 @@ from django.apps import AppConfig
class BgtasksConfig(AppConfig):
name = 'plane.bgtasks'
name = "plane.bgtasks"


@@ -47,15 +47,17 @@ def auth_events(user, email, user_agent, ip, event_name, medium, first_time):
"user_agent": user_agent,
},
"medium": medium,
"first_time": first_time
}
"first_time": first_time,
},
)
except Exception as e:
capture_exception(e)
@shared_task
def workspace_invite_event(user, email, user_agent, ip, event_name, accepted_from):
def workspace_invite_event(
user, email, user_agent, ip, event_name, accepted_from
):
try:
POSTHOG_API_KEY, POSTHOG_HOST = posthogConfiguration()
@@ -71,8 +73,8 @@ def workspace_invite_event(user, email, user_agent, ip, event_name, accepted_fro
"ip": ip,
"user_agent": user_agent,
},
"accepted_from": accepted_from
}
"accepted_from": accepted_from,
},
)
except Exception as e:
capture_exception(e)


@@ -68,7 +68,9 @@ def create_zip_file(files):
def upload_to_s3(zip_file, workspace_id, token_id, slug):
file_name = f"{workspace_id}/export-{slug}-{token_id[:6]}-{timezone.now()}.zip"
file_name = (
f"{workspace_id}/export-{slug}-{token_id[:6]}-{timezone.now()}.zip"
)
expires_in = 7 * 24 * 60 * 60
if settings.USE_MINIO:
@@ -87,7 +89,10 @@ def upload_to_s3(zip_file, workspace_id, token_id, slug):
)
presigned_url = s3.generate_presigned_url(
"get_object",
Params={"Bucket": settings.AWS_STORAGE_BUCKET_NAME, "Key": file_name},
Params={
"Bucket": settings.AWS_STORAGE_BUCKET_NAME,
"Key": file_name,
},
ExpiresIn=expires_in,
)
# Create the new url with updated domain and protocol
@@ -112,7 +117,10 @@ def upload_to_s3(zip_file, workspace_id, token_id, slug):
presigned_url = s3.generate_presigned_url(
"get_object",
Params={"Bucket": settings.AWS_STORAGE_BUCKET_NAME, "Key": file_name},
Params={
"Bucket": settings.AWS_STORAGE_BUCKET_NAME,
"Key": file_name,
},
ExpiresIn=expires_in,
)
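A presigned URL grants time-limited access to a private object without exposing credentials. A minimal standalone sketch matching the call above (bucket and key values are placeholders; `ExpiresIn` is in seconds):

import boto3

s3 = boto3.client("s3")
presigned_url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "my-bucket", "Key": "exports/archive.zip"},
    ExpiresIn=7 * 24 * 60 * 60,  # 7 days, the same window the task uses
)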
@@ -172,11 +180,17 @@ def generate_json_row(issue):
else "",
"Labels": issue["labels__name"],
"Cycle Name": issue["issue_cycle__cycle__name"],
"Cycle Start Date": dateConverter(issue["issue_cycle__cycle__start_date"]),
"Cycle Start Date": dateConverter(
issue["issue_cycle__cycle__start_date"]
),
"Cycle End Date": dateConverter(issue["issue_cycle__cycle__end_date"]),
"Module Name": issue["issue_module__module__name"],
"Module Start Date": dateConverter(issue["issue_module__module__start_date"]),
"Module Target Date": dateConverter(issue["issue_module__module__target_date"]),
"Module Start Date": dateConverter(
issue["issue_module__module__start_date"]
),
"Module Target Date": dateConverter(
issue["issue_module__module__target_date"]
),
"Created At": dateTimeConverter(issue["created_at"]),
"Updated At": dateTimeConverter(issue["updated_at"]),
"Completed At": dateTimeConverter(issue["completed_at"]),
@@ -211,7 +225,11 @@ def update_json_row(rows, row):
def update_table_row(rows, row):
matched_index = next(
(index for index, existing_row in enumerate(rows) if existing_row[0] == row[0]),
(
index
for index, existing_row in enumerate(rows)
if existing_row[0] == row[0]
),
None,
)
@@ -260,7 +278,9 @@ def generate_xlsx(header, project_id, issues, files):
@shared_task
def issue_export_task(provider, workspace_id, project_ids, token_id, multiple, slug):
def issue_export_task(
provider, workspace_id, project_ids, token_id, multiple, slug
):
try:
exporter_instance = ExporterHistory.objects.get(token=token_id)
exporter_instance.status = "processing"
@@ -273,9 +293,14 @@ def issue_export_task(provider, workspace_id, project_ids, token_id, multiple, s
project_id__in=project_ids,
project__project_projectmember__member=exporter_instance.initiated_by_id,
)
.select_related("project", "workspace", "state", "parent", "created_by")
.select_related(
"project", "workspace", "state", "parent", "created_by"
)
.prefetch_related(
"assignees", "labels", "issue_cycle__cycle", "issue_module__module"
"assignees",
"labels",
"issue_cycle__cycle",
"issue_module__module",
)
.values(
"id",


@@ -19,7 +19,8 @@ from plane.db.models import ExporterHistory
def delete_old_s3_link():
# Get a list of keys and IDs to process
expired_exporter_history = ExporterHistory.objects.filter(
Q(url__isnull=False) & Q(created_at__lte=timezone.now() - timedelta(days=8))
Q(url__isnull=False)
& Q(created_at__lte=timezone.now() - timedelta(days=8))
).values_list("key", "id")
if settings.USE_MINIO:
s3 = boto3.client(
@@ -42,8 +43,12 @@ def delete_old_s3_link():
# Delete object from S3
if file_name:
if settings.USE_MINIO:
s3.delete_object(Bucket=settings.AWS_STORAGE_BUCKET_NAME, Key=file_name)
s3.delete_object(
Bucket=settings.AWS_STORAGE_BUCKET_NAME, Key=file_name
)
else:
s3.delete_object(Bucket=settings.AWS_STORAGE_BUCKET_NAME, Key=file_name)
s3.delete_object(
Bucket=settings.AWS_STORAGE_BUCKET_NAME, Key=file_name
)
ExporterHistory.objects.filter(id=exporter_id).update(url=None)


@@ -14,10 +14,10 @@ from plane.db.models import FileAsset
@shared_task
def delete_file_asset():
# file assets to delete
file_assets_to_delete = FileAsset.objects.filter(
Q(is_deleted=True) & Q(updated_at__lte=timezone.now() - timedelta(days=7))
Q(is_deleted=True)
& Q(updated_at__lte=timezone.now() - timedelta(days=7))
)
# Delete the file from storage and the file object from the database
@@ -26,4 +26,3 @@ def delete_file_asset():
file_asset.asset.delete(save=False)
# Delete the file object
file_asset.delete()


@@ -42,7 +42,9 @@ def forgot_password(first_name, email, uidb64, token, current_site):
"email": email,
}
html_content = render_to_string("emails/auth/forgot_password.html", context)
html_content = render_to_string(
"emails/auth/forgot_password.html", context
)
text_content = strip_tags(html_content)


@@ -120,12 +120,17 @@ def service_importer(service, importer_id):
repository_id = importer.metadata.get("repository_id", False)
workspace_integration = WorkspaceIntegration.objects.get(
workspace_id=importer.workspace_id, integration__provider="github"
workspace_id=importer.workspace_id,
integration__provider="github",
)
# Delete the old repository object
GithubRepositorySync.objects.filter(project_id=importer.project_id).delete()
GithubRepository.objects.filter(project_id=importer.project_id).delete()
GithubRepositorySync.objects.filter(
project_id=importer.project_id
).delete()
GithubRepository.objects.filter(
project_id=importer.project_id
).delete()
# Create a Label for GitHub
label = Label.objects.filter(


@@ -112,8 +112,16 @@ def track_parent(
epoch,
):
if current_instance.get("parent") != requested_data.get("parent"):
old_parent = Issue.objects.filter(pk=current_instance.get("parent")).first() if current_instance.get("parent") is not None else None
new_parent = Issue.objects.filter(pk=requested_data.get("parent")).first() if requested_data.get("parent") is not None else None
old_parent = (
Issue.objects.filter(pk=current_instance.get("parent")).first()
if current_instance.get("parent") is not None
else None
)
new_parent = (
Issue.objects.filter(pk=requested_data.get("parent")).first()
if requested_data.get("parent") is not None
else None
)
issue_activities.append(
IssueActivity(
@@ -130,8 +138,12 @@ def track_parent(
project_id=project_id,
workspace_id=workspace_id,
comment=f"updated the parent issue to",
old_identifier=old_parent.id if old_parent is not None else None,
new_identifier=new_parent.id if new_parent is not None else None,
old_identifier=old_parent.id
if old_parent is not None
else None,
new_identifier=new_parent.id
if new_parent is not None
else None,
epoch=epoch,
)
)
@@ -209,7 +221,9 @@ def track_target_date(
issue_activities,
epoch,
):
if current_instance.get("target_date") != requested_data.get("target_date"):
if current_instance.get("target_date") != requested_data.get(
"target_date"
):
issue_activities.append(
IssueActivity(
issue_id=issue_id,
@@ -273,8 +287,12 @@ def track_labels(
issue_activities,
epoch,
):
requested_labels = set([str(lab) for lab in requested_data.get("labels", [])])
current_labels = set([str(lab) for lab in current_instance.get("labels", [])])
requested_labels = set(
[str(lab) for lab in requested_data.get("labels", [])]
)
current_labels = set(
[str(lab) for lab in current_instance.get("labels", [])]
)
added_labels = requested_labels - current_labels
dropped_labels = current_labels - requested_labels
@@ -331,8 +349,12 @@ def track_assignees(
issue_activities,
epoch,
):
requested_assignees = set([str(asg) for asg in requested_data.get("assignees", [])])
current_assignees = set([str(asg) for asg in current_instance.get("assignees", [])])
requested_assignees = set(
[str(asg) for asg in requested_data.get("assignees", [])]
)
current_assignees = set(
[str(asg) for asg in current_instance.get("assignees", [])]
)
added_assignees = requested_assignees - current_assignees
dropped_assginees = current_assignees - requested_assignees
@@ -384,7 +406,9 @@ def track_estimate_points(
issue_activities,
epoch,
):
if current_instance.get("estimate_point") != requested_data.get("estimate_point"):
if current_instance.get("estimate_point") != requested_data.get(
"estimate_point"
):
issue_activities.append(
IssueActivity(
issue_id=issue_id,
@@ -415,7 +439,9 @@ def track_archive_at(
issue_activities,
epoch,
):
if current_instance.get("archived_at") != requested_data.get("archived_at"):
if current_instance.get("archived_at") != requested_data.get(
"archived_at"
):
if requested_data.get("archived_at") is None:
issue_activities.append(
IssueActivity(
@@ -528,7 +554,9 @@ def update_issue_activity(
"closed_to": track_closed_to,
}
requested_data = json.loads(requested_data) if requested_data is not None else None
requested_data = (
json.loads(requested_data) if requested_data is not None else None
)
current_instance = (
json.loads(current_instance) if current_instance is not None else None
)
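
Note: update_issue_activity dispatches each incoming field to a tracker through a mapper dict (the "closed_to": track_closed_to entry above is its tail). A reduced, runnable sketch with a stand-in tracker; the real mapper and trackers are much larger:

import json


def track_parent(requested, current, activities):
    # Stand-in tracker: the real ones append IssueActivity rows
    activities.append(("parent", current.get("parent"), requested.get("parent")))


ISSUE_ACTIVITY_MAPPER = {
    "parent": track_parent,  # illustrative subset of the real mapper
}


def update_issue_activity(requested_data, current_instance, issue_activities):
    requested_data = (
        json.loads(requested_data) if requested_data is not None else None
    )
    current_instance = (
        json.loads(current_instance) if current_instance is not None else None
    )
    for field, tracker in ISSUE_ACTIVITY_MAPPER.items():
        if requested_data and field in requested_data:
            tracker(requested_data, current_instance or {}, issue_activities)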
@@ -581,7 +609,9 @@ def create_comment_activity(
issue_activities,
epoch,
):
requested_data = json.loads(requested_data) if requested_data is not None else None
requested_data = (
json.loads(requested_data) if requested_data is not None else None
)
current_instance = (
json.loads(current_instance) if current_instance is not None else None
)
@@ -613,12 +643,16 @@ def update_comment_activity(
issue_activities,
epoch,
):
requested_data = json.loads(requested_data) if requested_data is not None else None
requested_data = (
json.loads(requested_data) if requested_data is not None else None
)
current_instance = (
json.loads(current_instance) if current_instance is not None else None
)
if current_instance.get("comment_html") != requested_data.get("comment_html"):
if current_instance.get("comment_html") != requested_data.get(
"comment_html"
):
issue_activities.append(
IssueActivity(
issue_id=issue_id,
@@ -672,14 +706,18 @@ def create_cycle_issue_activity(
issue_activities,
epoch,
):
requested_data = json.loads(requested_data) if requested_data is not None else None
requested_data = (
json.loads(requested_data) if requested_data is not None else None
)
current_instance = (
json.loads(current_instance) if current_instance is not None else None
)
# Updated Records:
updated_records = current_instance.get("updated_cycle_issues", [])
created_records = json.loads(current_instance.get("created_cycle_issues", []))
created_records = json.loads(
current_instance.get("created_cycle_issues", [])
)
for updated_record in updated_records:
old_cycle = Cycle.objects.filter(
@@ -714,7 +752,9 @@ def create_cycle_issue_activity(
cycle = Cycle.objects.filter(
pk=created_record.get("fields").get("cycle")
).first()
issue = Issue.objects.filter(pk=created_record.get("fields").get("issue")).first()
issue = Issue.objects.filter(
pk=created_record.get("fields").get("issue")
).first()
if issue:
issue.updated_at = timezone.now()
issue.save(update_fields=["updated_at"])
@@ -746,7 +786,9 @@ def delete_cycle_issue_activity(
issue_activities,
epoch,
):
requested_data = json.loads(requested_data) if requested_data is not None else None
requested_data = (
json.loads(requested_data) if requested_data is not None else None
)
current_instance = (
json.loads(current_instance) if current_instance is not None else None
)
@@ -788,14 +830,18 @@ def create_module_issue_activity(
issue_activities,
epoch,
):
requested_data = json.loads(requested_data) if requested_data is not None else None
requested_data = (
json.loads(requested_data) if requested_data is not None else None
)
current_instance = (
json.loads(current_instance) if current_instance is not None else None
)
# Updated Records:
updated_records = current_instance.get("updated_module_issues", [])
created_records = json.loads(current_instance.get("created_module_issues", []))
created_records = json.loads(
current_instance.get("created_module_issues", [])
)
for updated_record in updated_records:
old_module = Module.objects.filter(
@@ -830,7 +876,9 @@ def create_module_issue_activity(
module = Module.objects.filter(
pk=created_record.get("fields").get("module")
).first()
issue = Issue.objects.filter(pk=created_record.get("fields").get("issue")).first()
issue = Issue.objects.filter(
pk=created_record.get("fields").get("issue")
).first()
if issue:
issue.updated_at = timezone.now()
issue.save(update_fields=["updated_at"])
@@ -861,7 +909,9 @@ def delete_module_issue_activity(
issue_activities,
epoch,
):
requested_data = json.loads(requested_data) if requested_data is not None else None
requested_data = (
json.loads(requested_data) if requested_data is not None else None
)
current_instance = (
json.loads(current_instance) if current_instance is not None else None
)
@@ -903,7 +953,9 @@ def create_link_activity(
issue_activities,
epoch,
):
requested_data = json.loads(requested_data) if requested_data is not None else None
requested_data = (
json.loads(requested_data) if requested_data is not None else None
)
current_instance = (
json.loads(current_instance) if current_instance is not None else None
)
@@ -934,7 +986,9 @@ def update_link_activity(
issue_activities,
epoch,
):
requested_data = json.loads(requested_data) if requested_data is not None else None
requested_data = (
json.loads(requested_data) if requested_data is not None else None
)
current_instance = (
json.loads(current_instance) if current_instance is not None else None
)
@@ -998,7 +1052,9 @@ def create_attachment_activity(
issue_activities,
epoch,
):
requested_data = json.loads(requested_data) if requested_data is not None else None
requested_data = (
json.loads(requested_data) if requested_data is not None else None
)
current_instance = (
json.loads(current_instance) if current_instance is not None else None
)
@@ -1053,7 +1109,9 @@ def create_issue_reaction_activity(
issue_activities,
epoch,
):
requested_data = json.loads(requested_data) if requested_data is not None else None
requested_data = (
json.loads(requested_data) if requested_data is not None else None
)
if requested_data and requested_data.get("reaction") is not None:
issue_reaction = (
IssueReaction.objects.filter(
@@ -1125,7 +1183,9 @@ def create_comment_reaction_activity(
issue_activities,
epoch,
):
requested_data = json.loads(requested_data) if requested_data is not None else None
requested_data = (
json.loads(requested_data) if requested_data is not None else None
)
if requested_data and requested_data.get("reaction") is not None:
comment_reaction_id, comment_id = (
CommentReaction.objects.filter(
@@ -1136,7 +1196,9 @@ def create_comment_reaction_activity(
.values_list("id", "comment__id")
.first()
)
comment = IssueComment.objects.get(pk=comment_id, project_id=project_id)
comment = IssueComment.objects.get(
pk=comment_id, project_id=project_id
)
if (
comment is not None
and comment_reaction_id is not None
@@ -1210,7 +1272,9 @@ def create_issue_vote_activity(
issue_activities,
epoch,
):
requested_data = json.loads(requested_data) if requested_data is not None else None
requested_data = (
json.loads(requested_data) if requested_data is not None else None
)
if requested_data and requested_data.get("vote") is not None:
issue_activities.append(
IssueActivity(
@@ -1272,44 +1336,48 @@ def create_issue_relation_activity(
issue_activities,
epoch,
):
requested_data = json.loads(requested_data) if requested_data is not None else None
requested_data = (
json.loads(requested_data) if requested_data is not None else None
)
current_instance = (
json.loads(current_instance) if current_instance is not None else None
)
if current_instance is None and requested_data.get("related_list") is not None:
for issue_relation in requested_data.get("related_list"):
if issue_relation.get("relation_type") == "blocked_by":
relation_type = "blocking"
else:
relation_type = issue_relation.get("relation_type")
issue = Issue.objects.get(pk=issue_relation.get("issue"))
if current_instance is None and requested_data.get("issues") is not None:
for related_issue in requested_data.get("issues"):
issue = Issue.objects.get(pk=related_issue)
issue_activities.append(
IssueActivity(
issue_id=issue_relation.get("related_issue"),
issue_id=issue_id,
actor_id=actor_id,
verb="created",
old_value="",
new_value=f"{issue.project.identifier}-{issue.sequence_id}",
field=relation_type,
field=requested_data.get("relation_type"),
project_id=project_id,
workspace_id=workspace_id,
comment=f"added {relation_type} relation",
old_identifier=issue_relation.get("issue"),
comment=f"added {requested_data.get('relation_type')} relation",
old_identifier=related_issue,
)
)
issue = Issue.objects.get(pk=issue_relation.get("related_issue"))
issue = Issue.objects.get(pk=issue_id)
issue_activities.append(
IssueActivity(
issue_id=issue_relation.get("issue"),
issue_id=related_issue,
actor_id=actor_id,
verb="created",
old_value="",
new_value=f"{issue.project.identifier}-{issue.sequence_id}",
field=f'{issue_relation.get("relation_type")}',
field="blocking"
if requested_data.get("relation_type") == "blocked_by"
else (
"blocked_by"
if requested_data.get("relation_type") == "blocking"
else requested_data.get("relation_type")
),
project_id=project_id,
workspace_id=workspace_id,
comment=f'added {issue_relation.get("relation_type")} relation',
old_identifier=issue_relation.get("related_issue"),
comment=f'added {"blocking" if requested_data.get("relation_type") == "blocked_by" else ("blocked_by" if requested_data.get("relation_type") == "blocking" else requested_data.get("relation_type")),} relation',
old_identifier=issue_id,
epoch=epoch,
)
)
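
Note: the rewritten create_issue_relation_activity mirrors every relation onto the related issue, inverting blocked_by and blocking in the process. The nested conditional it inlines twice reads more clearly as a helper; a sketch:

def invert_relation(relation_type):
    # "blocked_by" and "blocking" mirror each other; every other
    # relation type ("relates_to", "duplicate", ...) is symmetric.
    if relation_type == "blocked_by":
        return "blocking"
    if relation_type == "blocking":
        return "blocked_by"
    return relation_type


# On the mirrored IssueActivity row:
#   field=invert_relation(requested_data.get("relation_type"))
#   comment=f"added {invert_relation(requested_data.get('relation_type'))} relation"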
@@ -1325,44 +1393,47 @@ def delete_issue_relation_activity(
issue_activities,
epoch,
):
requested_data = json.loads(requested_data) if requested_data is not None else None
requested_data = (
json.loads(requested_data) if requested_data is not None else None
)
current_instance = (
json.loads(current_instance) if current_instance is not None else None
)
if current_instance is not None and requested_data.get("related_list") is None:
if current_instance.get("relation_type") == "blocked_by":
relation_type = "blocking"
else:
relation_type = current_instance.get("relation_type")
issue = Issue.objects.get(pk=current_instance.get("issue"))
issue = Issue.objects.get(pk=requested_data.get("related_issue"))
issue_activities.append(
IssueActivity(
issue_id=current_instance.get("related_issue"),
issue_id=issue_id,
actor_id=actor_id,
verb="deleted",
old_value=f"{issue.project.identifier}-{issue.sequence_id}",
new_value="",
field=relation_type,
field=requested_data.get("relation_type"),
project_id=project_id,
workspace_id=workspace_id,
comment=f"deleted {relation_type} relation",
old_identifier=current_instance.get("issue"),
comment=f"deleted {requested_data.get('relation_type')} relation",
old_identifier=requested_data.get("related_issue"),
epoch=epoch,
)
)
issue = Issue.objects.get(pk=current_instance.get("related_issue"))
issue = Issue.objects.get(pk=issue_id)
issue_activities.append(
IssueActivity(
issue_id=current_instance.get("issue"),
issue_id=requested_data.get("related_issue"),
actor_id=actor_id,
verb="deleted",
old_value=f"{issue.project.identifier}-{issue.sequence_id}",
new_value="",
field=f'{current_instance.get("relation_type")}',
field="blocking"
if requested_data.get("relation_type") == "blocked_by"
else (
"blocked_by"
if requested_data.get("relation_type") == "blocking"
else requested_data.get("relation_type")
),
project_id=project_id,
workspace_id=workspace_id,
comment=f'deleted {current_instance.get("relation_type")} relation',
old_identifier=current_instance.get("related_issue"),
comment=f'deleted {requested_data.get("relation_type")} relation',
old_identifier=requested_data.get("related_issue"),
epoch=epoch,
)
)
@@ -1402,7 +1473,9 @@ def update_draft_issue_activity(
issue_activities,
epoch,
):
requested_data = json.loads(requested_data) if requested_data is not None else None
requested_data = (
json.loads(requested_data) if requested_data is not None else None
)
current_instance = (
json.loads(current_instance) if current_instance is not None else None
)
@@ -1529,7 +1602,9 @@ def issue_activity(
)
# Save all the values to database
issue_activities_created = IssueActivity.objects.bulk_create(issue_activities)
issue_activities_created = IssueActivity.objects.bulk_create(
issue_activities
)
# Post the updates to segway for integrations and webhooks
if len(issue_activities_created):
# Don't send activities if the actor is a bot
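
Note: issue_activity persists the accumulated rows in one round trip and then fans them out. A sketch of that step; as a general Django caveat rather than anything this diff changes, bulk_create may return objects without primary keys on backends that cannot report inserted ids:

import json

from django.core.serializers.json import DjangoJSONEncoder


def persist_and_serialize(model, serializer_cls, instances):
    # One INSERT for the whole batch; serialize with DjangoJSONEncoder so
    # UUID and datetime fields survive json.dumps
    created = model.objects.bulk_create(instances)
    return created, json.dumps(
        serializer_cls(created, many=True).data, cls=DjangoJSONEncoder
    )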
@@ -1556,7 +1631,9 @@ def issue_activity(
project_id=project_id,
subscriber=subscriber,
issue_activities_created=json.dumps(
IssueActivitySerializer(issue_activities_created, many=True).data,
IssueActivitySerializer(
issue_activities_created, many=True
).data,
cls=DjangoJSONEncoder,
),
requested_data=requested_data,

View File

@@ -36,7 +36,9 @@ def archive_old_issues():
Q(
project=project_id,
archived_at__isnull=True,
updated_at__lte=(timezone.now() - timedelta(days=archive_in * 30)),
updated_at__lte=(
timezone.now() - timedelta(days=archive_in * 30)
),
state__group__in=["completed", "cancelled"],
),
Q(issue_cycle__isnull=True)
@@ -46,7 +48,9 @@ def archive_old_issues():
),
Q(issue_module__isnull=True)
| (
Q(issue_module__module__target_date__lt=timezone.now().date())
Q(
issue_module__module__target_date__lt=timezone.now().date()
)
& Q(issue_module__isnull=False)
),
).filter(
@@ -74,7 +78,9 @@ def archive_old_issues():
_ = [
issue_activity.delay(
type="issue.activity.updated",
requested_data=json.dumps({"archived_at": str(archive_at)}),
requested_data=json.dumps(
{"archived_at": str(archive_at)}
),
actor_id=str(project.created_by_id),
issue_id=issue.id,
project_id=project_id,
@@ -108,7 +114,9 @@ def close_old_issues():
Q(
project=project_id,
archived_at__isnull=True,
updated_at__lte=(timezone.now() - timedelta(days=close_in * 30)),
updated_at__lte=(
timezone.now() - timedelta(days=close_in * 30)
),
state__group__in=["backlog", "unstarted", "started"],
),
Q(issue_cycle__isnull=True)
@@ -118,7 +126,9 @@ def close_old_issues():
),
Q(issue_module__isnull=True)
| (
Q(issue_module__module__target_date__lt=timezone.now().date())
Q(
issue_module__module__target_date__lt=timezone.now().date()
)
& Q(issue_module__isnull=False)
),
).filter(
@@ -131,7 +141,9 @@ def close_old_issues():
# Check if there are any issues to close
if issues:
if project.default_state is None:
close_state = State.objects.filter(group="cancelled").first()
close_state = State.objects.filter(
group="cancelled"
).first()
else:
close_state = project.default_state
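
Note: archive_old_issues and close_old_issues share one selection: issues untouched for N months (archive_in and close_in are stored in months, hence the * 30) whose cycles and modules, if any, are already past their dates. A condensed sketch; the cycle end-date condition is elided in the hunks and assumed here:

from datetime import timedelta

from django.db.models import Q
from django.utils import timezone

from plane.db.models import Issue  # assumed import path


def stale_issues(project_id, months, state_groups):
    cutoff = timezone.now() - timedelta(days=months * 30)
    return Issue.objects.filter(
        Q(
            project=project_id,
            archived_at__isnull=True,
            updated_at__lte=cutoff,
            state__group__in=state_groups,
        ),
        # Only issues whose cycle, if any, has ended (lookup assumed)
        Q(issue_cycle__isnull=True)
        | Q(issue_cycle__cycle__end_date__lt=timezone.now().date()),
        # ...and whose module, if any, is past its target date
        Q(issue_module__isnull=True)
        | Q(issue_module__module__target_date__lt=timezone.now().date()),
    )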

View File

@@ -33,7 +33,9 @@ def magic_link(email, key, token, current_site):
subject = f"Your unique Plane login code is {token}"
context = {"code": token, "email": email}
html_content = render_to_string("emails/auth/magic_signin.html", context)
html_content = render_to_string(
"emails/auth/magic_signin.html", context
)
text_content = strip_tags(html_content)
connection = get_connection(

View File

@@ -12,7 +12,7 @@ from plane.db.models import (
Issue,
Notification,
IssueComment,
IssueActivity
IssueActivity,
)
# Third Party imports
@@ -20,9 +20,9 @@ from celery import shared_task
from bs4 import BeautifulSoup
# =========== Issue Description Html Parsing and Notification Functions ======================
def update_mentions_for_issue(issue, project, new_mentions, removed_mention):
aggregated_issue_mentions = []
@@ -32,14 +32,14 @@ def update_mentions_for_issue(issue, project, new_mentions, removed_mention):
mention_id=mention_id,
issue=issue,
project=project,
workspace_id=project.workspace_id
workspace_id=project.workspace_id,
)
)
IssueMention.objects.bulk_create(
aggregated_issue_mentions, batch_size=100)
IssueMention.objects.bulk_create(aggregated_issue_mentions, batch_size=100)
IssueMention.objects.filter(
issue=issue, mention__in=removed_mention).delete()
issue=issue, mention__in=removed_mention
).delete()
def get_new_mentions(requested_instance, current_instance):
@@ -53,10 +53,12 @@ def get_new_mentions(requested_instance, current_instance):
# Getting Set Difference from mentions_newer
new_mentions = [
mention for mention in mentions_newer if mention not in mentions_older]
mention for mention in mentions_newer if mention not in mentions_older
]
return new_mentions
# Get Removed Mention
@@ -70,10 +72,12 @@ def get_removed_mentions(requested_instance, current_instance):
# Getting Set Difference from mentions_newer
removed_mentions = [
mention for mention in mentions_older if mention not in mentions_newer]
mention for mention in mentions_older if mention not in mentions_newer
]
return removed_mentions
# Adds mentions as subscribers
@@ -84,27 +88,34 @@ def extract_mentions_as_subscribers(project_id, issue_id, mentions):
for mention_id in mentions:
# If this mentioned user is not already subscribed to the issue, they must be sent the mention notification
if not IssueSubscriber.objects.filter(
if (
not IssueSubscriber.objects.filter(
issue_id=issue_id,
subscriber_id=mention_id,
project_id=project_id,
).exists() and not IssueAssignee.objects.filter(
project_id=project_id, issue_id=issue_id,
assignee_id=mention_id
).exists() and not Issue.objects.filter(
).exists()
and not IssueAssignee.objects.filter(
project_id=project_id,
issue_id=issue_id,
assignee_id=mention_id,
).exists()
and not Issue.objects.filter(
project_id=project_id, pk=issue_id, created_by_id=mention_id
).exists():
).exists()
):
project = Project.objects.get(pk=project_id)
bulk_mention_subscribers.append(IssueSubscriber(
bulk_mention_subscribers.append(
IssueSubscriber(
workspace_id=project.workspace_id,
project_id=project_id,
issue_id=issue_id,
subscriber_id=mention_id,
))
)
)
return bulk_mention_subscribers
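
Note: extract_mentions_as_subscribers, reflowed above, subscribes a mentioned user only when no existing relationship to the issue (subscriber, assignee, creator) would already surface the activity to them. The guard as a hypothetical predicate:

from plane.db.models import Issue, IssueAssignee, IssueSubscriber  # assumed import path


def should_subscribe_mention(project_id, issue_id, mention_id):
    # A mention becomes a subscriber only when no existing relationship
    # to the issue would already notify this user.
    return (
        not IssueSubscriber.objects.filter(
            issue_id=issue_id, subscriber_id=mention_id, project_id=project_id
        ).exists()
        and not IssueAssignee.objects.filter(
            project_id=project_id, issue_id=issue_id, assignee_id=mention_id
        ).exists()
        and not Issue.objects.filter(
            project_id=project_id, pk=issue_id, created_by_id=mention_id
        ).exists()
    )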
# Parse Issue Description & extracts mentions
def extract_mentions(issue_instance):
try:
@@ -113,11 +124,12 @@ def extract_mentions(issue_instance):
# Convert string to dictionary
data = json.loads(issue_instance)
html = data.get("description_html")
soup = BeautifulSoup(html, 'html.parser')
soup = BeautifulSoup(html, "html.parser")
mention_tags = soup.find_all(
'mention-component', attrs={'target': 'users'})
"mention-component", attrs={"target": "users"}
)
mentions = [mention_tag['id'] for mention_tag in mention_tags]
mentions = [mention_tag["id"] for mention_tag in mention_tags]
return list(set(mentions))
except Exception as e:
@@ -128,18 +140,18 @@ def extract_mentions(issue_instance):
def extract_comment_mentions(comment_value):
try:
mentions = []
soup = BeautifulSoup(comment_value, 'html.parser')
soup = BeautifulSoup(comment_value, "html.parser")
mentions_tags = soup.find_all(
'mention-component', attrs={'target': 'users'}
"mention-component", attrs={"target": "users"}
)
for mention_tag in mentions_tags:
mentions.append(mention_tag['id'])
mentions.append(mention_tag["id"])
return list(set(mentions))
except Exception as e:
return []
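
Note: both extractors parse the editor HTML with BeautifulSoup and collect the id of each mention-component tag with target="users", deduplicating via a set. A standalone sketch:

from bs4 import BeautifulSoup


def extract_mention_ids(html):
    # Returns the unique user ids referenced by mention tags; an empty
    # list on unparsable input mirrors the defensive except branches above.
    try:
        soup = BeautifulSoup(html or "", "html.parser")
        tags = soup.find_all("mention-component", attrs={"target": "users"})
        return list({tag["id"] for tag in tags})
    except Exception:
        return []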
def get_new_comment_mentions(new_value, old_value):
mentions_newer = extract_comment_mentions(new_value)
if old_value is None:
return mentions_newer
@@ -147,12 +159,21 @@ def get_new_comment_mentions(new_value, old_value):
mentions_older = extract_comment_mentions(old_value)
# Getting Set Difference from mentions_newer
new_mentions = [
mention for mention in mentions_newer if mention not in mentions_older]
mention for mention in mentions_newer if mention not in mentions_older
]
return new_mentions
def createMentionNotification(project, notification_comment, issue, actor_id, mention_id, issue_id, activity):
def createMentionNotification(
project,
notification_comment,
issue,
actor_id,
mention_id,
issue_id,
activity,
):
return Notification(
workspace=project.workspace,
sender="in_app:issue_activities:mentioned",
@@ -178,16 +199,26 @@ def createMentionNotification(project, notification_comment, issue, actor_id, me
"actor": str(activity.get("actor_id")),
"new_value": str(activity.get("new_value")),
"old_value": str(activity.get("old_value")),
}
},
},
)
@shared_task
def notifications(type, issue_id, project_id, actor_id, subscriber, issue_activities_created, requested_data, current_instance):
def notifications(
type,
issue_id,
project_id,
actor_id,
subscriber,
issue_activities_created,
requested_data,
current_instance,
):
issue_activities_created = (
json.loads(
issue_activities_created) if issue_activities_created is not None else None
json.loads(issue_activities_created)
if issue_activities_created is not None
else None
)
if type not in [
"issue.activity.deleted",
@@ -216,18 +247,24 @@ def notifications(type, issue_id, project_id, actor_id, subscriber, issue_activi
# Get new mentions from the newer instance
new_mentions = get_new_mentions(
requested_instance=requested_data, current_instance=current_instance)
requested_instance=requested_data,
current_instance=current_instance,
)
removed_mention = get_removed_mentions(
requested_instance=requested_data, current_instance=current_instance)
requested_instance=requested_data,
current_instance=current_instance,
)
comment_mentions = []
all_comment_mentions = []
# Get New Subscribers from the mentions of the newer instance
requested_mentions = extract_mentions(
issue_instance=requested_data)
requested_mentions = extract_mentions(issue_instance=requested_data)
mention_subscribers = extract_mentions_as_subscribers(
project_id=project_id, issue_id=issue_id, mentions=requested_mentions)
project_id=project_id,
issue_id=issue_id,
mentions=requested_mentions,
)
for issue_activity in issue_activities_created:
issue_comment = issue_activity.get("issue_comment")
@@ -236,12 +273,22 @@ def notifications(type, issue_id, project_id, actor_id, subscriber, issue_activi
if issue_comment is not None:
# TODO: Maybe save the comment mentions, so that in future, we can filter out the issues based on comment mentions as well.
all_comment_mentions = all_comment_mentions + extract_comment_mentions(issue_comment_new_value)
all_comment_mentions = (
all_comment_mentions
+ extract_comment_mentions(issue_comment_new_value)
)
new_comment_mentions = get_new_comment_mentions(old_value=issue_comment_old_value, new_value=issue_comment_new_value)
new_comment_mentions = get_new_comment_mentions(
old_value=issue_comment_old_value,
new_value=issue_comment_new_value,
)
comment_mentions = comment_mentions + new_comment_mentions
comment_mention_subscribers = extract_mentions_as_subscribers( project_id=project_id, issue_id=issue_id, mentions=all_comment_mentions)
comment_mention_subscribers = extract_mentions_as_subscribers(
project_id=project_id,
issue_id=issue_id,
mentions=all_comment_mentions,
)
"""
We will not send subscription activity notifications to the user sets below:
- Those newly mentioned in the issue description; they receive a mention notification instead.
@@ -251,50 +298,75 @@ def notifications(type, issue_id, project_id, actor_id, subscriber, issue_activi
issue_assignees = list(
IssueAssignee.objects.filter(
project_id=project_id, issue_id=issue_id)
project_id=project_id, issue_id=issue_id
)
.exclude(assignee_id__in=list(new_mentions + comment_mentions))
.values_list("assignee", flat=True)
)
issue_subscribers = list(
IssueSubscriber.objects.filter(
project_id=project_id, issue_id=issue_id)
.exclude(subscriber_id__in=list(new_mentions + comment_mentions + [actor_id]))
project_id=project_id, issue_id=issue_id
)
.exclude(
subscriber_id__in=list(
new_mentions + comment_mentions + [actor_id]
)
)
.values_list("subscriber", flat=True)
)
issue = Issue.objects.filter(pk=issue_id).first()
if (issue.created_by_id is not None and str(issue.created_by_id) != str(actor_id)):
if issue.created_by_id is not None and str(issue.created_by_id) != str(
actor_id
):
issue_subscribers = issue_subscribers + [issue.created_by_id]
if subscriber:
# add the user to issue subscriber
try:
if str(issue.created_by_id) != str(actor_id) and uuid.UUID(actor_id) not in issue_assignees:
if (
str(issue.created_by_id) != str(actor_id)
and uuid.UUID(actor_id) not in issue_assignees
):
_ = IssueSubscriber.objects.get_or_create(
project_id=project_id, issue_id=issue_id, subscriber_id=actor_id
project_id=project_id,
issue_id=issue_id,
subscriber_id=actor_id,
)
except Exception as e:
pass
project = Project.objects.get(pk=project_id)
issue_subscribers = list(set(issue_subscribers + issue_assignees) - {uuid.UUID(actor_id)})
issue_subscribers = list(
set(issue_subscribers + issue_assignees) - {uuid.UUID(actor_id)}
)
for subscriber in issue_subscribers:
if subscriber in issue_subscribers:
sender = "in_app:issue_activities:subscribed"
if issue.created_by_id is not None and subscriber == issue.created_by_id:
if (
issue.created_by_id is not None
and subscriber == issue.created_by_id
):
sender = "in_app:issue_activities:created"
if subscriber in issue_assignees:
sender = "in_app:issue_activities:assigned"
for issue_activity in issue_activities_created:
# Do not send notification for description update
if issue_activity.get("field") == "description":
continue
issue_comment = issue_activity.get("issue_comment")
if issue_comment is not None:
issue_comment = IssueComment.objects.get(
id=issue_comment, issue_id=issue_id, project_id=project_id, workspace_id=project.workspace_id)
id=issue_comment,
issue_id=issue_id,
project_id=project_id,
workspace_id=project.workspace_id,
)
bulk_notifications.append(
Notification(
@@ -320,11 +392,16 @@ def notifications(type, issue_id, project_id, actor_id, subscriber, issue_activi
"verb": str(issue_activity.get("verb")),
"field": str(issue_activity.get("field")),
"actor": str(issue_activity.get("actor_id")),
"new_value": str(issue_activity.get("new_value")),
"old_value": str(issue_activity.get("old_value")),
"new_value": str(
issue_activity.get("new_value")
),
"old_value": str(
issue_activity.get("old_value")
),
"issue_comment": str(
issue_comment.comment_stripped
if issue_activity.get("issue_comment") is not None
if issue_activity.get("issue_comment")
is not None
else ""
),
},
@@ -334,7 +411,8 @@ def notifications(type, issue_id, project_id, actor_id, subscriber, issue_activi
# Add Mentioned as Issue Subscribers
IssueSubscriber.objects.bulk_create(
mention_subscribers + comment_mention_subscribers, batch_size=100)
mention_subscribers + comment_mention_subscribers, batch_size=100
)
last_activity = (
IssueActivity.objects.filter(issue_id=issue_id)
@@ -345,7 +423,7 @@ def notifications(type, issue_id, project_id, actor_id, subscriber, issue_activi
actor = User.objects.get(pk=actor_id)
for mention_id in comment_mentions:
if (mention_id != actor_id):
if mention_id != actor_id:
for issue_activity in issue_activities_created:
notification = createMentionNotification(
project=project,
@@ -354,13 +432,12 @@ def notifications(type, issue_id, project_id, actor_id, subscriber, issue_activi
actor_id=actor_id,
mention_id=mention_id,
issue_id=issue_id,
activity=issue_activity
activity=issue_activity,
)
bulk_notifications.append(notification)
for mention_id in new_mentions:
if (mention_id != actor_id):
if mention_id != actor_id:
if (
last_activity is not None
and last_activity.field == "description"
@@ -380,7 +457,9 @@ def notifications(type, issue_id, project_id, actor_id, subscriber, issue_activi
"issue": {
"id": str(issue_id),
"name": str(issue.name),
"identifier": str(issue.project.identifier),
"identifier": str(
issue.project.identifier
),
"sequence_id": issue.sequence_id,
"state_name": issue.state.name,
"state_group": issue.state.group,
@@ -405,15 +484,17 @@ def notifications(type, issue_id, project_id, actor_id, subscriber, issue_activi
actor_id=actor_id,
mention_id=mention_id,
issue_id=issue_id,
activity=issue_activity
activity=issue_activity,
)
bulk_notifications.append(notification)
# save new mentions for the particular issue and remove the mentions that has been deleted from the description
update_mentions_for_issue(issue=issue, project=project, new_mentions=new_mentions,
removed_mention=removed_mention)
update_mentions_for_issue(
issue=issue,
project=project,
new_mentions=new_mentions,
removed_mention=removed_mention,
)
# Bulk create notifications
Notification.objects.bulk_create(bulk_notifications, batch_size=100)

View File

@@ -15,6 +15,7 @@ from sentry_sdk import capture_exception
from plane.db.models import Project, User, ProjectMemberInvite
from plane.license.utils.instance_value import get_email_configuration
@shared_task
def project_invitation(email, project_id, token, current_site, invitor):
try:

View File

@@ -189,7 +189,8 @@ def send_webhook(event, payload, kw, action, slug, bulk):
pk__in=[
str(event.get("issue")) for event in payload
]
).prefetch_related("issue_cycle", "issue_module"), many=True
).prefetch_related("issue_cycle", "issue_module"),
many=True,
).data
event = "issue"
action = "PATCH"
@@ -197,7 +198,9 @@ def send_webhook(event, payload, kw, action, slug, bulk):
event_data = [
get_model_data(
event=event,
event_id=payload.get("id") if isinstance(payload, dict) else None,
event_id=payload.get("id")
if isinstance(payload, dict)
else None,
many=False,
)
]
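
Note: the send_webhook hunks wrap a serializer call and a conditional: bulk payloads arrive as lists, so a top-level id exists only when the payload is a single dict. As a helper:

def resolve_event_id(payload):
    # Bulk webhook payloads are lists of events; only a single-event
    # dict carries a top-level id.
    return payload.get("id") if isinstance(payload, dict) else None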

View File

@@ -36,7 +36,6 @@ def workspace_invitation(email, workspace_id, token, current_site, invitor):
# The complete url including the domain
abs_url = str(current_site) + relative_link
(
EMAIL_HOST,
EMAIL_HOST_USER,

Some files were not shown because too many files have changed in this diff.