Compare commits

...

144 Commits

Author SHA1 Message Date
pablohashescobar
cdc1a81bd6 fix: move paginator to different files; add count queryset for counting 2024-07-21 20:04:13 +05:30
sriram veeraghanta
8f9b568a65 fix: adding new validation to check the page is available before proceeding with update (#5176) 2024-07-19 17:44:45 +05:30
sriram veeraghanta
a6d111f66d fix: sentry profiling default value to zero 2024-07-19 17:30:58 +05:30
guru_sainath
f1f7fa907a [WEB-1883] chore: moving issue activity store to respective folder (#5169)
* chore: issue activity store

* chore: updated issue activity store and handled workspace settings order

* chore: added parameter on the issue worklog component

* chore: handled popover close from prop
2024-07-19 16:11:25 +05:30
Bavisetti Narayan
b4feaf973a chore: added details in cycle detail endpoint (#5132) 2024-07-19 15:56:54 +05:30
Bavisetti Narayan
39a607ac0a [WEB-1985] chore: page access control (#5154)
* chore: page access control

* chore: page access update endpoint updated

---------

Co-authored-by: Anmol Singh Bhatia <anmolsinghbhatia@plane.so>
2024-07-19 15:43:01 +05:30
Bavisetti Narayan
d3c3d3c5ab chore: changed the naming convention (#5171) 2024-07-19 15:40:53 +05:30
Nikhil
065c9779bb [WEB - 1998] fix: profile creation on user signup (#5168)
* fix: profile creation while signing up

* dev: destructure tuple for get or create
2024-07-19 15:35:28 +05:30
Aaryan Khandelwal
cb21dcbcef fix: disable editor history conditionally (#5133) 2024-07-19 15:31:22 +05:30
Akshita Goyal
e7948eabf2 [WEB-1956] fix: Keyboard shortcuts (#5134)
* fix: shortcuts

* fix: naming

* fix: structure optimization
2024-07-19 15:28:48 +05:30
Anmol Singh Bhatia
c2b5464e40 fix: empty issue title indicator (#5173) 2024-07-19 15:12:59 +05:30
Anmol Singh Bhatia
e055abb711 fix: issue link edit modal and mutation fix (#5172) 2024-07-19 13:56:36 +05:30
Prateek Shourya
44a0ff5c67 [WEB-1995] fix: searched page redirection from command palette. (#5170)
* [WEB-1995] fix: searched page redirection from command palette.

* chore: update redirect logic.
2024-07-19 13:56:16 +05:30
dependabot[bot]
075b8efa99 chore(deps): bump sentry-sdk in /apiserver/requirements (#5165)
Bumps [sentry-sdk](https://github.com/getsentry/sentry-python) from 2.0.1 to 2.8.0.
- [Release notes](https://github.com/getsentry/sentry-python/releases)
- [Changelog](https://github.com/getsentry/sentry-python/blob/master/CHANGELOG.md)
- [Commits](https://github.com/getsentry/sentry-python/compare/2.0.1...2.8.0)

---
updated-dependencies:
- dependency-name: sentry-sdk
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-19 01:52:16 +05:30
Akshita Goyal
f27c25821c [WEB-1959]: Fix/member page revamp (#5163)
* fix: css issue + no pending issues handled

* fix: type issues

* fix: css changes
2024-07-18 20:50:25 +05:30
Anmol Singh Bhatia
aade07b37a fix: project intake disabled validation (#5161) 2024-07-18 18:19:11 +05:30
guru_sainath
8107045d8c fix: issue activity render enum keys update (#5162) 2024-07-18 18:18:32 +05:30
Akshita Goyal
4ce255a872 [WEB-1918]: Fix/sidebar button issue (#5160)
* fix: sidebar collapsed on smaller screen by default

* fix: linting

* fix: export issue

* fix: button action fixed
2024-07-18 17:00:33 +05:30
Akshita Goyal
a8c1b8cdef fix: inbox dependency array fix (#5159) 2024-07-18 16:00:16 +05:30
Akshita Goyal
78dd15a801 [WEB-1918]: Fix/sidebar collapse issue (#5157)
* fix: sidebar collapsed on smaller screen by default

* fix: linting

* fix: export issue
2024-07-18 15:52:48 +05:30
Aaryan Khandelwal
2d434f0b9c fix: disable selection if no issues are present (#5158) 2024-07-18 15:50:22 +05:30
Aaryan Khandelwal
209b700fd9 fix: page breadcrumb tooltip persistence (#5137) 2024-07-18 15:49:25 +05:30
Anmol Singh Bhatia
39e3c28ad8 [WEB-1981] chore: project view icon and empty state (#5153)
* chore: view icon updated

* chore: view asset updated

* chore: project view empty state updated
2024-07-18 15:46:16 +05:30
Prateek Shourya
cfc70622d6 [WEB-1960]: chore: upgrade to plane paid plans modal. (#5149) 2024-07-18 15:45:37 +05:30
Prateek Shourya
281948c1ce [WEB-1984] fix: code block padding and margin in pages. (#5152) 2024-07-18 15:28:40 +05:30
guru_sainath
2554110397 fix: build errors ee (#5156) 2024-07-18 14:59:28 +05:30
guru_sainath
482b363045 [WEB-1883] chore: moved workspace settings to respective folders for CE and EE (#5151)
* chore: moved workspace settings to respective folders for ce and ee

* chore: updated imports

* chore: updated imports for ee

* chore: resolved import error

* chore: resolved import error

* chore: ee imports in the issue sidebar

* chore: updated file structure

* chore: table UI

* chore: resolved build errors

* chore: added worklog on issue peekoverview
2024-07-18 14:45:30 +05:30
Akshita Goyal
fff27c60e4 [WEB-1959]: Chore/settings member page (#5144)
* chore: implemented table component in ui library

* chore: added export in the UI package

* chore/member-page-revamp

* fix: added custom popover className

* fix: updated ui for projects

* fix: hide pending invites for members

* fix: added ee component

* removed unwanted logging

* fix: separated components

* fix: used collapsible instead of disclosure

* fix: removed commented code

---------

Co-authored-by: gurusainath <gurusainath007@gmail.com>
2024-07-18 13:02:22 +05:30
Akshita Goyal
474d7ef3c0 [WEB-1948] fix: priority icons (#5146)
* fix: priority icons

* fix: icon size specified
2024-07-18 13:01:48 +05:30
Anmol Singh Bhatia
a7ecfade98 [WEB-1920] dev: app sidebar revamp (#5150)
* chore: user activity icon added

* dev: sidebar navigation component added

* chore: dashboard constant file updated

* chore: unread notification indicator position

* chore: app sidebar project section

* chore: app sidebar User and Workspace section updated

* chore: notification to inbox transition

* chore: code refactor

* chore: code refactor
2024-07-18 12:56:33 +05:30
Akshita Goyal
996192b9bf fix: showing first issue as default inbox state (#5147) 2024-07-17 18:46:40 +05:30
sriram veeraghanta
4cb02a9270 fix: removing swr refresh interval from the global config 2024-07-17 13:55:06 +05:30
Anmol Singh Bhatia
85719b9a12 chore: project intake toast and activity message updated (#5143) 2024-07-16 18:44:32 +05:30
Anmol Singh Bhatia
0b1f9f0e5b fix: spreadsheet layout quick action event propagation (#5141) 2024-07-16 16:06:19 +05:30
Anmol Singh Bhatia
d042dac042 fix: issue export project select dropdown width and truncate fix (#5138) 2024-07-16 15:58:55 +05:30
Anmol Singh Bhatia
f2733ab4df fix: issue detail widget user role permission added (#5131) 2024-07-16 15:57:14 +05:30
Anmol Singh Bhatia
5464e62a03 [WEB-1962] fix: disabled custom menu (#5130)
* fix: custom menu disabled button

* Add constants package to package.json

* Freeze hook form version

---------

Co-authored-by: Satish Gandham <satish.iitg@gmail.com>
2024-07-16 15:22:12 +05:30
Anmol Singh Bhatia
e4d6e5e1af [WEB-1730] chore: project intake (#5140)
* chore: intake icon added

* chore: project inbox updated to intake in app sidebar and feature settings

* chore: intake icon added

* chore: project intake

* chore: project intake empty state asset updated
2024-07-16 15:21:03 +05:30
guru_sainath
cd85a9fe09 chore: handled the auto form submit for all authenticators (#5139) 2024-07-16 14:16:10 +05:30
Nikhil
6ade86f89d dev: rename user display configuration model (#5119)
* dev: rename model

* dev: add fields to project and issue types
2024-07-16 13:51:28 +05:30
Prateek Shourya
65caaa14cd [WEB-1957] fix: exception error on label creation for unauthorized users. (#5127) 2024-07-15 20:29:20 +05:30
dependabot[bot]
0e92cae05f chore(deps): bump django in /apiserver/requirements (#5128)
Bumps [django](https://github.com/django/django) from 4.2.11 to 4.2.14.
- [Commits](https://github.com/django/django/compare/4.2.11...4.2.14)

---
updated-dependencies:
- dependency-name: django
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-07-15 20:24:53 +05:30
Anmol Singh Bhatia
9523799f34 fix: issue widgets collapsible ui (#5126) 2024-07-15 19:51:59 +05:30
guru_sainath
f5f3c4915f [WEB-1846] chore: integrated project other features enabled/disabled feature on project settings and updated the pro icon as a component (#5071)
* chore: integrated time tracking enabled/disabled feature on project settings and updated the pro icon as a component

* chore: Showing the toggle and disabled to make any operations on project features

* chore: default exports in constants

* chore: separated isEnabled and isPro

* chore: updated time tracking key

* chore: updated UI in project feature settings
2024-07-15 19:48:27 +05:30
rahulramesha
08d9e95a86 [WEB-1255] chore: Refactor existing Space app for project publish (#5107)
* chore: paginated the issues in space app

* chore: storing query using filters

* chore: added filters for priority

* chore: issue view model save function

* chore: votes and reactions added in issues endpoint

* chore: added filters in the public endpoint

* chore: issue detail endpoint

* chore: added labels, modules and assignees

* refactor existing project publish in space app

* fix clear all filters in space App

* chore: removed the extra serializer

* remove optional chaining and fallback to an empty array

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-07-15 18:35:45 +05:30
guru_sainath
22671ec8a7 [WEB-1954] chore: implemented table component in ui library (#5125)
* chore: implemented table component in ui library

* chore: added export in the UI package
2024-07-15 18:34:16 +05:30
Prateek Shourya
56331a7b55 [WEB-1949] chore: delete workspace components restructuring. (#5123)
* [WEB-1949] chore: delete workspace components restructuring.

* chore: update delete workspace section.
2024-07-15 16:55:26 +05:30
Akshita Goyal
33d6a8d233 [WEB-1820]: Fix/project selection dropdown (#5122)
* fix: Truncated project name under custom analytics's project selection dropdown

* fix: project name truncated

* fix: removed static width

* fix: hardcoded width

* fix: css

* fix: handled the custom search button

* Freeze hookform version

* Revert yarn lock

---------

Co-authored-by: Satish Gandham <satish.iitg@gmail.com>
2024-07-15 16:47:40 +05:30
Akshita Goyal
4c353b6eeb [WEB-1557] fix: create issue disabled for guests (#5102)
* fix: create issue disabled for guests

* fix: workspace role type

* fix: create modal issue

* fix: removed the create action in guest mode

* Remove unused imports

---------

Co-authored-by: Satish Gandham <satish.iitg@gmail.com>
2024-07-15 14:18:57 +05:30
Bavisetti Narayan
0cc5a5357b fix: added atomic transactions (#5109) 2024-07-15 14:06:50 +05:30
Anmol Singh Bhatia
e758e08785 chore: space app issue comment placeholder updated (#5121) 2024-07-15 13:35:06 +05:30
Anmol Singh Bhatia
890888a274 chore: export issues validation added (#5118) 2024-07-15 13:27:20 +05:30
Anmol Singh Bhatia
f7de9a3497 fix: module and cycle sidebar scroll (#5113) 2024-07-15 13:26:28 +05:30
Anmol Singh Bhatia
830d1c0b5a fix: issue label activity truncate fix and chip component added (#5120) 2024-07-15 13:13:20 +05:30
rahulramesha
4b0946e093 fix sort order for workspace views (#5112) 2024-07-12 19:53:11 +05:30
Anmol Singh Bhatia
1a26768291 [WEB-1902] fix: last draft issue properties (#5110)
* fix: handleFormChange added to start and due date properties

* fix: useEffect added to update localStorage when isDirty changes
2024-07-12 19:49:23 +05:30
Anmol Singh Bhatia
c93b826c48 chore: updated the order of issue detail widgets in the peek overview (#5111) 2024-07-12 19:46:00 +05:30
Prateek Shourya
ce89c7dcff [WEB-1929] chore: improve finishOnboarding logic to handle case where user profile setup is done and user already has a workspace. (#5105) 2024-07-12 17:17:58 +05:30
Akshita Goyal
f06095f120 [WEB-1759] fix: project dropdown action (#5088)
* fix: project dropdown action

* chore: added redirection for collapsed sidebar

* fix: disclosure panel close issue

* fix: removed redundancy

* fix: truncate issue
2024-07-11 20:20:32 +05:30
Anmol Singh Bhatia
dd3b0f6a3f [WEB-1921] fix: issue widgets modal and code refactor (#5106)
* fix: celery fix

* chore: issue relationkey and issueCrudOperation state added to issueDetail store

* chore: moved issue detail widget modal to root

* chore: code refactor

* chore: default open widget updated
2024-07-11 20:12:09 +05:30
Bavisetti Narayan
24973c1386 [WEB-1909] chore: removed duplication of assignee and label activity (#5095)
* chore: removed duplication of assignee and label activity

* chore: removed the print statement

* chore: updated the queryset
2024-07-11 14:43:02 +05:30
Anmol Singh Bhatia
15b0a448ee [WEB-1925] dev: issue detail widget enhancement (#5101)
* chore: collapsible button border color updated

* chore: TIssueDetailWidget type added

* chore: issue link modal onClose updated

* chore: issue detail widgets collapse state added to store

* chore: issue detail widget interaction added

* chore: issue detail widget interaction added
2024-07-11 14:34:56 +05:30
Nikhil
4d484577b5 dev: fix page versioning task (#5104) 2024-07-11 13:47:32 +05:30
Akshita Goyal
2d78f6fd22 fix: empty state for view page fixed (#5090) 2024-07-11 13:37:40 +05:30
Akshita Goyal
77694ee8ba [WEB-1876] fix: "Show sub-issues" checkbox checked by default under Archives (#5091)
* fix: "Show sub-issues" checkbox checked by default under Archives

* fix: default value set
2024-07-11 13:37:07 +05:30
Akshita Goyal
ac8e588ac3 [WEB-1820] fix: analytics truncate project name (#5089)
* fix: Truncated project name under custom analytics's project selection dropdown

* fix: project name truncated

* fix: removed static width

* fix: hardcoded width

* fix: css
2024-07-11 13:36:34 +05:30
guru_sainath
2136872351 [WEB-1916] ui: updated the empty state design in workspace notifications and ui changes (#5093)
* ui: updated the empty state design in workspace notifications and ui changes

* chore: updated the popover custom components

* ui: updated the badge ui on the sidebar options

* ui: broken down the menu components
2024-07-11 13:19:07 +05:30
Anmol Singh Bhatia
a90724516b chore: auth screen layout padding (#5087) 2024-07-11 13:18:06 +05:30
Prateek Shourya
31f67e189d [WEB-1843] chore: billing page and upgrade badge UI improvements. (#5099)
* [WEB-1843] chore: billing page and upgrade badge UI improvements.

* chore: fix sidebar collapsed state.
2024-07-10 19:38:21 +05:30
Nikhil
c6db050443 chore: page version migrations (#5103)
* chore: rewrite page version migration to remove data back migration

* dev: rename exporter history choice field

* dev: update migration
2024-07-10 19:37:04 +05:30
Nikhil
f9a3778c7f fix: data migrations for page versioning (#5100)
* dev: remove issue type back migrations

* dev: revert data migrations

* dev: update migrations to run async

* dev: remove unused imports
2024-07-10 15:03:41 +05:30
Nikhil
ec1662cbd6 dev: remove page version back migrations (#5092) 2024-07-09 19:58:49 +05:30
Nikhil
7986a28ca2 [WEB - 1837]feat: page versioning (#5019)
* dev: create issue types and add back migration for existing issues

* dev: fix save

* dev: fix migration for issue types

* dev: create page version

* dev: add page versioning migrations

* dev: create page version endpoints

* dev: add is_default value in issue type

* dev: add start date and target date to project

* chore: updated migration

* dev: get issue_types

* fix: typo

* dev: update fetch ordering

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-07-09 18:55:35 +05:30
guru_sainath
cd540e9641 [WEB-1908] chore: update input type number validation to type text in estimate input form (#5086)
* chore: removed input type number validation in estimate input form

* chore: removed pattern
2024-07-09 15:31:12 +05:30
Anmol Singh Bhatia
676ec7e396 [WEB-1899] fix: issue attachment delete modal and code refactor (#5085)
* chore: issue attachment modal state updated in store

* fix: issue attachment delete modal fix and code refactor
2024-07-09 15:14:23 +05:30
Bavisetti Narayan
6b12c78cea [WEB-1904] chore: updated setup env (#5082)
* chore: updated setup env

* chore: removed the web env
2024-07-09 13:48:36 +05:30
guru_sainath
f617937542 [WEB-1900] chore: mentions mutation, ui fix on app sidebar notification badge, and back button inbox issue notification embed (#5083)
* chore: mention notification boolean field

* chore: handled mentions and all-notification mutation, UI fix on the app sidebar notification badge, and back redirection button on inbox issue responsiveness

* chore: Moved everything to chip

* chore: cleaning up the selection when we unmount the page

* chore: resolved build error

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-07-09 13:41:34 +05:30
Nikhil
988201d729 [WEB - 1888] dev: log issue activity when transferring issues from one cycle to another (#5073)
* fix: cycle transfer activity

* chore: external api transfer issue

* chore: moved the cycle id

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-07-08 20:54:10 +05:30
Anmol Singh Bhatia
6c2b28df91 fix: attachment delete modal (#5080) 2024-07-08 20:53:35 +05:30
Anmol Singh Bhatia
53e5d4b40c [WEB-1680] dev: issue detail activity revamp and issue detail page improvement (#5075)
* chore: issue link activity message updated

* chore: activity filter type constant added

* dev: issue activity revamp and code refactor

* chore: issue detail widget order updated in peek overview

* chore: issue detail page padding improvement

* fix: relation widget toast alert

* fix: relation widget toast alert

* fix: peek overview attachment delete modal

* chore: code refactor

* chore: code refactor

* chore: code refactor

* chore: code refactor

* chore: issue detail sidebar parent field
2024-07-08 19:33:19 +05:30
Anmol Singh Bhatia
fd61079c8b chore: project active cycle progress state group color updated (#5077) 2024-07-08 18:52:47 +05:30
guru_sainath
7767be2e21 [WEB-1889] fix: handled tapping on a notification in Notifications from mobile in inbox issue and issue peek overview component (#5074)
* fix: handled tapping on a notification in Notifications from mobile in inbox issue and issue peek overview component

* fix: code cleanup

* fix: code cleanup on workspace notification store

* fix: updated selected notification on workspace notification store
2024-07-08 18:52:30 +05:30
rahulramesha
a623456e63 [WEB-1890] fix: issue creation by using appropriate stores (#5072)
* fix issue creation by using appropriate stores

* add comments

* change useIssuesStore hook to use useIssueStoreType
2024-07-08 18:52:10 +05:30
Anmol Singh Bhatia
fb586c58d2 fix: created by tooltip removed (#5076) 2024-07-08 17:43:20 +05:30
Anmol Singh Bhatia
fb46249ccf [WEB-1872] fix: completed cycle gantt layout quick add issue validation (#5066)
* fix: completed cycle gantt layout quick add issue validation

* chore: code refactor
2024-07-08 15:55:53 +05:30
Anmol Singh Bhatia
0e4ce2baa5 chore: issue widgets added to issue peek overview (#5069) 2024-07-08 15:51:59 +05:30
Anmol Singh Bhatia
4e815c0fed fix: issue link error toast alert (#5068) 2024-07-08 15:28:38 +05:30
Nikhil
1cd55cd95b fix: remove user workspace cache on account deactivation (#5065) 2024-07-08 15:27:45 +05:30
Satish Gandham
d8d476463b [WEB-1728] Chore: Preload apis required to bootstrap the application (#5026)
* chore: prefetch apis

* chore: implemented cache-control

* Preload links with credentials

* chore: updated time in the cache and handled it based on cookie

* chore: make cache private

---------

Co-authored-by: gurusainath <gurusainath007@gmail.com>
2024-07-08 15:26:52 +05:30
guru_sainath
fc2585bf64 [WEB-1878] ui: implementing popover component in a UI package (#5063)
* ui: implemented popover component

* chore: implemented component in project-states

* chore: added default styling for popover menu panel

* chore: removed propsWithChildren in popover component type
2024-07-08 15:03:22 +05:30
Anmol Singh Bhatia
6dcbea6d14 fix: module detail sidebar scroll (#5064) 2024-07-08 12:59:57 +05:30
Anmol Singh Bhatia
12c24ad255 fix: dashboard collaborators endpoint active member filter (#5062) 2024-07-08 12:04:00 +05:30
Anmol Singh Bhatia
387dbd89f5 [WEB-1679] feat: issue detail widgets (#5034)
* chore: issue detail sidebar and main content improvement and code refactor

* dev: issue relation list component added

* chore: code refactor

* dev: issue detail widget implementation

* dev: update issue relation endpoint to return same response as sub issue

* chore: changed updated_by in issue attachment

* fix: peek view link ui

* chore: move collapsible button component to plane ui package

* chore: issue list component code refactor

* chore: relation icon updated

* chore: relation icon updated

* chore: issue quick action ui updated

* chore: wrap title indicatorElement component with useMemo

* chore: code refactor

* fix: build error
2024-07-05 16:51:58 +05:30
Manish Gupta
b7d792ed07 fix: Fixed aio build (#5056)
* fix: aio build

* fix aio branch build
2024-07-05 16:45:02 +05:30
Bavisetti Narayan
54a5e5e761 [WEB-1437] feat: notifications mention filter (#5040)
* chore: implemented mentions on the notification

* chore: mention notification filter

* chore: handled mentions refetch and total count on header and sidebar menu option

* chore: separated notifications empty state

* chore: updated sidebar menu option notification validation

* chore: handled notification sidebar total notifications count

---------

Co-authored-by: gurusainath <gurusainath007@gmail.com>
2024-07-05 16:13:09 +05:30
Anmol Singh Bhatia
837f09ed90 fix: kanban subgroup quick add validation (#5055) 2024-07-05 16:10:16 +05:30
guru_sainath
38f8aa90c1 [WEB-1519] chore: update component structure in project state settings and implement DND (#5043)
* chore: updated project settings state

* chore: updated sorting on project state

* chore: updated grab handler in state item

* chore: Updated UI and added grab handler icon

* chore: handled top and bottom sequence in middle element swap

* chore: handled input state element char limit to 100

* chore: typos and code cleanup in create state

* chore: handled typos and comments wherever is required

* chore: handled sorting logic
2024-07-05 16:09:33 +05:30
guru_sainath
c75091ca3a [WEB-1803] chore: workspace notification sorting when we refresh the notifications on workspace notifications (#5049)
* chore: workspace notification sorting when we refresh the notifications on workspace notifications

* chore: replaced sorting with ISO to EPOCH

* chore: converted obj to array
2024-07-05 16:09:16 +05:30
Nikhil
61ce055cb3 [WEB - 1740] chore: add issue id in pages detail endpoint (#4942)
* chore: add issue id in pages detail endpoint

* fix: response structure changed

---------

Co-authored-by: sriram veeraghanta <veeraghanta.sriram@gmail.com>
2024-07-05 15:00:15 +05:30
Prateek Shourya
977b47d35f [WEB-1843] chore: minor file restructuring. (#5044) 2024-07-05 14:58:42 +05:30
Bavisetti Narayan
509c258b07 chore: custom analytics charts (#5052) 2024-07-05 14:55:48 +05:30
Vihar Kurama
3cfb0ac518 update assignees for github issue templates (#5053) 2024-07-05 14:30:37 +05:30
guru_sainath
156f1011f3 chore: active cycle mutation on current cycle creation when current_date is between start_date and end_date (#5050) 2024-07-05 12:13:28 +05:30
Bavisetti Narayan
a36d1a753e chore: corrected the subissue ordering (#5030) 2024-07-04 19:51:05 +05:30
Bavisetti Narayan
9a927ded84 chore: estimate points float field (#5038) 2024-07-04 19:50:08 +05:30
rahulramesha
72f00e378d fix build errors due to conflicting PRs (#5047) 2024-07-04 17:13:04 +05:30
guru_sainath
d3ec1aa422 [WEB-1792] chore: updated UI notification tooltip on header and app sidebar (#5046)
* ui: updated UI notification tooltip on header and app sidebar

* fix: reverted notification sorting

* fix: renamed function name updateIssue with issueUpdate in issue base helper file
2024-07-04 17:04:00 +05:30
Anmol Singh Bhatia
dda19b0a3f fix: issue peek view link list ui (#5045) 2024-07-04 17:03:25 +05:30
Aaryan Khandelwal
2b570da890 chore: change comment box placeholder (#5042) 2024-07-04 16:21:05 +05:30
rahulramesha
5918607171 [WEB-1818] fix: issue changes done in Peek overview to reflect in the issue list boards (#5010)
* fix issue changes done in Peek overview to reflect in the issue boards

* Adding comments to aliased names
2024-07-04 16:18:33 +05:30
rahulramesha
f1496e3144 fix module Quick add (#5039) 2024-07-04 16:17:49 +05:30
sriram veeraghanta
9717497b4e chore: package version change 2024-07-04 15:27:37 +05:30
guru_sainath
2f8c8ac40f [WEB-1847] chore: handled project id on the issue creation modal toast when the issue creation is happening on the home page (#5033)
* chore: handled project id on the issue creation modal toast when the issue creation is happening on the home page

* chore: updated issue modal condition
2024-07-04 12:08:26 +05:30
guru_sainath
734e920e08 chore: updated dropdown hover ui on estimate dropdown menu options (#5032) 2024-07-03 19:17:01 +05:30
Prateek Shourya
82fa1347d1 [WEB-1843] chore: active cycles file restructuring. (#5031) 2024-07-03 19:14:15 +05:30
rahulramesha
a7aa5c2ba7 [WEB-1829] fix: stop issues query before fetching the view filters (#5015)
* fix to not query issues before fetching the view filters

* add default group by for Kanban views
2024-07-03 16:25:41 +05:30
Nikhil
825b2f26bf fix: issue pagination listing (#5029) 2024-07-03 15:56:12 +05:30
Anmol Singh Bhatia
af51992eba [WEB-1715] chore: issue filters indicator enhancement (#5027)
* chore: isIssueFilterActive helper function added

* chore: isIssueFilterActive implementation

* chore: code refactor
2024-07-03 15:43:22 +05:30
Anmol Singh Bhatia
8f59a36bda fix: sidebar quick action overlapping (#5020) 2024-07-03 13:49:28 +05:30
Manish Gupta
5ad0114aac fix: feature preview (#4976)
* custom base tags

* fixed aio actions
2024-07-03 13:30:25 +05:30
Nikhil
095639b976 fix: workspace slug validation (#5023) 2024-07-03 13:19:57 +05:30
Prateek Shourya
db722d580f [WEB-1801] fix: avoid opening shortcut guide on editors / input elements. (#5025) 2024-07-03 13:11:51 +05:30
Anmol Singh Bhatia
0363057d9c fix: modal core onClose (#5018) 2024-07-03 11:34:52 +05:30
Nikhil
3dc933f0e8 [WEB - 1835] chore: update workspace constants (#5017)
* dev: add workspace restriction list in constants

* dev: update list
2024-07-02 20:05:35 +05:30
guru_sainath
fc33238d89 fix: updated unread count UI and validation (#5016) 2024-07-02 19:43:22 +05:30
Aaryan Khandelwal
86464c1d6f fix: project publish redirection (#5004) 2024-07-02 19:28:55 +05:30
Prateek Shourya
cc479f39a7 [WEB-1808] style: fix settings highlight on app sidebar. (#4995) 2024-07-02 19:28:02 +05:30
Aaryan Khandelwal
b50df9ef99 fix: peek overview layout dropdown icons (#4993) 2024-07-02 19:27:13 +05:30
rahulramesha
c637639a3e tweak pagination and virtualization thresholds to have a smoother scroll (#4991) 2024-07-02 19:26:26 +05:30
M. Palanikannan
12401c54cc fix: update link view now updates the link (#4989) 2024-07-02 19:25:32 +05:30
Anmol Singh Bhatia
1201a4245e [WEB-1679] chore: sub-issues, attachments, and links UI revamp (#5007)
* chore: issue attachment ui revamp

* chore: issue link ui revamp

* chore: attachment icon improvement

* chore: sub-issue ui revamp

* chore: open on hover functionality added to custom menu

* chore: code refactor
2024-07-02 19:06:20 +05:30
guru_sainath
fc15ca5565 [WEB-1792] fix: handled redirection issue when we change the status of the inbox issue in the notification (#5014)
* fix: handled redirection issue when we change the status of inbox issue in notification

* fix: updated condition
2024-07-02 18:49:40 +05:30
Quadrubo
4e8b7e6dbb fix: api requests cors (#4929) 2024-07-02 18:09:14 +05:30
Anmol Singh Bhatia
b0bc818362 [WEB-1819] dev: collapsible component (#5001)
* dev: accordion component added

* chore: code refactor

* chore: collapsible component improvement
2024-07-02 17:10:53 +05:30
Nikhil
c8491a13b3 [WEB - 1827]remove: migration for account and social login connection (#5013)
* fix: OAuth adapter error codes + missing account provider migration that introduces GitLab (#4998)

* feat(apiserver): GitLab OAuth client

* feat(admin,packages,space,web): GitLab OAuth client

* Feat(apiserver/oauth): authentication_error_code()

* chore: remove empty files introduced by rebase

* dev: delete migration

---------

Co-authored-by: jon ⚝ <jon@allmende.io>
2024-07-02 17:06:22 +05:30
guru_sainath
83587c2c6b fix: converted and handled the estimate type to lowercase in the store (#5011) 2024-07-02 16:35:06 +05:30
rahulramesha
d9d62c2d5a Add empty state when view is not available (#5002) 2024-07-02 16:21:28 +05:30
Anmol Singh Bhatia
b591203da6 [WEB-1679] chore: relation and attachment icons (#5005)
* chore: attachment icon improvement

* chore: relation and dropdown icon added

* chore: code refactor
2024-07-02 16:15:42 +05:30
Prateek Shourya
78e0405971 [WEB-1801] improvement: open shortcut guide using shift+/ key combination. (#5000) 2024-07-02 16:14:42 +05:30
guru_sainath
26040144fc [WEB-1792] chore: integrated inbox issue in notification peek view and handled increment/decrement of unread notifications (#5008)
* chore: added a boolean field in notification list

* chore: notification filters changed

* chore: handled inbox notification and typo on the card items

* chore: handled notification count increment and decrement

* chore: typos and ui updates

---------

Co-authored-by: NarayanBavisetti <narayan3119@gmail.com>
2024-07-02 16:12:27 +05:30
Prateek Shourya
0fd36257d7 [WEB-1814] chore: update OIDC and SAML logo. (#5009)
* [WEB-1814] chore: update OIDC and SAML logo.

* chore: update OIDC logo.
2024-07-02 16:11:57 +05:30
Anmol Singh Bhatia
c217185b07 chore: open on hover functionality added to custom menu (#5003) 2024-07-02 14:53:17 +05:30
Anmol Singh Bhatia
a5628c4ce1 fix: issue link count mutation (#5006) 2024-07-02 14:53:05 +05:30
Prateek Shourya
764e08140c [WEB-1814] chore: admin app UI & UX improvements. (#4999) 2024-07-02 12:58:45 +05:30
554 changed files with 12403 additions and 5151 deletions


@@ -2,7 +2,7 @@ name: Bug report
description: Create a bug report to help us improve Plane
title: "[bug]: "
labels: [🐛bug]
assignees: [srinivaspendem, pushya22]
assignees: [vihar, pushya22]
body:
- type: markdown
attributes:


@@ -2,7 +2,7 @@ name: Feature request
description: Suggest a feature to improve Plane
title: "[feature]: "
labels: [✨feature]
assignees: [srinivaspendem, pushya22]
assignees: [vihar, pushya22]
body:
- type: markdown
attributes:


@@ -2,6 +2,11 @@ name: Build AIO Base Image
on:
workflow_dispatch:
inputs:
base_tag_name:
description: 'Base Tag Name'
required: false
default: ''
env:
TARGET_BRANCH: ${{ github.ref_name }}
@@ -22,27 +27,29 @@ jobs:
- id: set_env_variables
name: Set Environment Variables
run: |
echo "TARGET_BRANCH=${{ env.TARGET_BRANCH }}" >> $GITHUB_OUTPUT
if [ "${{ github.event.inputs.base_tag_name }}" != "" ]; then
echo "IMAGE_TAG=${{ github.event.inputs.base_tag_name }}" >> $GITHUB_OUTPUT
elif [ "${{ env.TARGET_BRANCH }}" == "master" ]; then
echo "IMAGE_TAG=latest" >> $GITHUB_OUTPUT
elif [ "${{ env.TARGET_BRANCH }}" == "preview" ]; then
echo "IMAGE_TAG=preview" >> $GITHUB_OUTPUT
else
echo "IMAGE_TAG=develop" >> $GITHUB_OUTPUT
fi
if [ "${{ env.TARGET_BRANCH }}" == "master" ]; then
echo "BUILDX_DRIVER=cloud" >> $GITHUB_OUTPUT
echo "BUILDX_VERSION=lab:latest" >> $GITHUB_OUTPUT
echo "BUILDX_PLATFORMS=linux/amd64,linux/arm64" >> $GITHUB_OUTPUT
echo "BUILDX_ENDPOINT=makeplane/plane-dev" >> $GITHUB_OUTPUT
echo "TARGET_BRANCH=${{ env.TARGET_BRANCH }}" >> $GITHUB_OUTPUT
echo "IMAGE_TAG=latest" >> $GITHUB_OUTPUT
else
echo "BUILDX_DRIVER=docker-container" >> $GITHUB_OUTPUT
echo "BUILDX_VERSION=latest" >> $GITHUB_OUTPUT
echo "BUILDX_PLATFORMS=linux/amd64" >> $GITHUB_OUTPUT
echo "BUILDX_ENDPOINT=" >> $GITHUB_OUTPUT
if [ "${{ env.TARGET_BRANCH }}" == "preview" ]; then
echo "TARGET_BRANCH=preview" >> $GITHUB_OUTPUT
echo "IMAGE_TAG=preview" >> $GITHUB_OUTPUT
else
echo "TARGET_BRANCH=develop" >> $GITHUB_OUTPUT
echo "IMAGE_TAG=develop" >> $GITHUB_OUTPUT
fi
fi
- id: checkout_files
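The tag-selection logic in the workflow step above can be mirrored in a short Python sketch for clarity: an explicit `base_tag_name` input wins, otherwise the target branch decides the image tag. The function name is illustrative, not part of the workflow.

```python
def resolve_image_tag(base_tag_name: str, target_branch: str) -> str:
    # Mirrors the shell logic in set_env_variables:
    # a non-empty workflow_dispatch input overrides everything,
    # then master -> latest, preview -> preview, anything else -> develop.
    if base_tag_name:
        return base_tag_name
    if target_branch == "master":
        return "latest"
    if target_branch == "preview":
        return "preview"
    return "develop"
```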
@@ -54,7 +61,6 @@ jobs:
needs: [base_build_setup]
env:
BASE_IMG_TAG: makeplane/plane-aio-base:full-${{ needs.base_build_setup.outputs.image_tag }}
TARGET_BRANCH: ${{ needs.base_build_setup.outputs.gh_branch_name }}
BUILDX_DRIVER: ${{ needs.base_build_setup.outputs.gh_buildx_driver }}
BUILDX_VERSION: ${{ needs.base_build_setup.outputs.gh_buildx_version }}
BUILDX_PLATFORMS: ${{ needs.base_build_setup.outputs.gh_buildx_platforms }}
@@ -96,7 +102,6 @@ jobs:
needs: [base_build_setup]
env:
BASE_IMG_TAG: makeplane/plane-aio-base:slim-${{ needs.base_build_setup.outputs.image_tag }}
TARGET_BRANCH: ${{ needs.base_build_setup.outputs.gh_branch_name }}
BUILDX_DRIVER: ${{ needs.base_build_setup.outputs.gh_buildx_driver }}
BUILDX_VERSION: ${{ needs.base_build_setup.outputs.gh_buildx_version }}
BUILDX_PLATFORMS: ${{ needs.base_build_setup.outputs.gh_buildx_platforms }}


@@ -13,6 +13,10 @@ on:
type: boolean
required: false
default: false
base_tag_name:
description: 'Base Tag Name'
required: false
default: ''
release:
types: [released, prereleased]
@@ -27,6 +31,7 @@ jobs:
runs-on: ubuntu-latest
outputs:
gh_branch_name: ${{ steps.set_env_variables.outputs.TARGET_BRANCH }}
flat_branch_name: ${{ steps.set_env_variables.outputs.FLAT_BRANCH_NAME }}
gh_buildx_driver: ${{ steps.set_env_variables.outputs.BUILDX_DRIVER }}
gh_buildx_version: ${{ steps.set_env_variables.outputs.BUILDX_VERSION }}
gh_buildx_platforms: ${{ steps.set_env_variables.outputs.BUILDX_PLATFORMS }}
@@ -52,7 +57,9 @@ jobs:
echo "BUILDX_PLATFORMS=linux/amd64" >> $GITHUB_OUTPUT
echo "BUILDX_ENDPOINT=" >> $GITHUB_OUTPUT
if [ "${{ env.TARGET_BRANCH }}" == "preview" ]; then
if [ "${{ github.event_name}}" == "workflow_dispatch" ] && [ "${{ github.event.inputs.base_tag_name }}" != "" ]; then
echo "AIO_BASE_TAG=${{ github.event.inputs.base_tag_name }}" >> $GITHUB_OUTPUT
elif [ "${{ env.TARGET_BRANCH }}" == "preview" ]; then
echo "AIO_BASE_TAG=preview" >> $GITHUB_OUTPUT
else
echo "AIO_BASE_TAG=develop" >> $GITHUB_OUTPUT
@@ -72,6 +79,9 @@ jobs:
echo "DO_SLIM_BUILD=false" >> $GITHUB_OUTPUT
fi
FLAT_BRANCH_NAME=$(echo "${{ env.TARGET_BRANCH }}" | sed 's/[^a-zA-Z0-9]/-/g')
echo "FLAT_BRANCH_NAME=$FLAT_BRANCH_NAME" >> $GITHUB_OUTPUT
- id: checkout_files
name: Checkout Files
uses: actions/checkout@v4
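The `FLAT_BRANCH_NAME` step above uses `sed` to make branch names safe for Docker image tags. The same substitution expressed in Python (function name illustrative):

```python
import re

def flatten_branch_name(branch: str) -> str:
    # Same transformation as the workflow's
    # sed 's/[^a-zA-Z0-9]/-/g': every character outside
    # [a-zA-Z0-9] becomes a hyphen, so branches like
    # "feature/WEB-1985_fix" yield a valid image-tag suffix.
    return re.sub(r"[^a-zA-Z0-9]", "-", branch)
```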
@@ -83,7 +93,7 @@ jobs:
env:
BUILD_TYPE: full
AIO_BASE_TAG: ${{ needs.branch_build_setup.outputs.aio_base_tag }}
AIO_IMAGE_TAGS: makeplane/plane-aio:full-${{ needs.branch_build_setup.outputs.gh_branch_name }}
AIO_IMAGE_TAGS: makeplane/plane-aio:full-${{ needs.branch_build_setup.outputs.flat_branch_name }}
TARGET_BRANCH: ${{ needs.branch_build_setup.outputs.gh_branch_name }}
BUILDX_DRIVER: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
BUILDX_VERSION: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
@@ -126,7 +136,7 @@ jobs:
tags: ${{ env.AIO_IMAGE_TAGS }}
push: true
build-args: |
BUILD_TAG=${{ env.AIO_BASE_TAG }}
BASE_TAG=${{ env.AIO_BASE_TAG }}
BUILD_TYPE=${{env.BUILD_TYPE}}
cache-from: type=gha
cache-to: type=gha,mode=max
@@ -143,7 +153,7 @@ jobs:
env:
BUILD_TYPE: slim
AIO_BASE_TAG: ${{ needs.branch_build_setup.outputs.aio_base_tag }}
AIO_IMAGE_TAGS: makeplane/plane-aio:slim-${{ needs.branch_build_setup.outputs.gh_branch_name }}
AIO_IMAGE_TAGS: makeplane/plane-aio:slim-${{ needs.branch_build_setup.outputs.flat_branch_name }}
TARGET_BRANCH: ${{ needs.branch_build_setup.outputs.gh_branch_name }}
BUILDX_DRIVER: ${{ needs.branch_build_setup.outputs.gh_buildx_driver }}
BUILDX_VERSION: ${{ needs.branch_build_setup.outputs.gh_buildx_version }}
@@ -186,7 +196,7 @@ jobs:
tags: ${{ env.AIO_IMAGE_TAGS }}
push: true
build-args: |
BUILD_TAG=${{ env.AIO_BASE_TAG }}
BASE_TAG=${{ env.AIO_BASE_TAG }}
BUILD_TYPE=${{env.BUILD_TYPE}}
cache-from: type=gha
cache-to: type=gha,mode=max


@@ -2,6 +2,11 @@ name: Feature Preview
on:
workflow_dispatch:
inputs:
base_tag_name:
description: 'Base Tag Name'
required: false
default: 'preview'
env:
TARGET_BRANCH: ${{ github.ref_name }}
@@ -29,7 +34,12 @@ jobs:
echo "BUILDX_VERSION=latest" >> $GITHUB_OUTPUT
echo "BUILDX_PLATFORMS=linux/amd64" >> $GITHUB_OUTPUT
echo "BUILDX_ENDPOINT=" >> $GITHUB_OUTPUT
echo "AIO_BASE_TAG=develop" >> $GITHUB_OUTPUT
if [ "${{ github.event.inputs.base_tag_name }}" != "" ]; then
echo "AIO_BASE_TAG=${{ github.event.inputs.base_tag_name }}" >> $GITHUB_OUTPUT
else
echo "AIO_BASE_TAG=develop" >> $GITHUB_OUTPUT
fi
echo "TARGET_BRANCH=${{ env.TARGET_BRANCH }}" >> $GITHUB_OUTPUT


@@ -1,6 +1,5 @@
# Environment Variables
Environment variables are distributed in various files. Please refer them carefully.
## {PROJECT_FOLDER}/.env
@@ -9,17 +8,13 @@ File is available in the project root folder
```
# Database Settings
PGUSER="plane"
PGPASSWORD="plane"
PGHOST="plane-db"
PGDATABASE="plane"
DATABASE_URL=postgresql://${PGUSER}:${PGPASSWORD}@${PGHOST}/${PGDATABASE}
POSTGRES_USER="plane"
POSTGRES_PASSWORD="plane"
POSTGRES_DB="plane"
PGDATA="/var/lib/postgresql/data"
# Redis Settings
REDIS_HOST="plane-redis"
REDIS_PORT="6379"
REDIS_URL="redis://${REDIS_HOST}:6379/"
# AWS Settings
AWS_REGION=""
AWS_ACCESS_KEY_ID="access-key"
@@ -29,63 +24,39 @@ AWS_S3_ENDPOINT_URL="http://plane-minio:9000"
AWS_S3_BUCKET_NAME="uploads"
# Maximum file upload limit
FILE_SIZE_LIMIT=5242880
# GPT settings
OPENAI_API_BASE="https://api.openai.com/v1" # deprecated
OPENAI_API_KEY="sk-" # deprecated
GPT_ENGINE="gpt-3.5-turbo" # deprecated
# Settings related to Docker
DOCKERIZED=1 # deprecated
# set to 1 If using the pre-configured minio setup
USE_MINIO=1
# Nginx Configuration
NGINX_PORT=80
```
## {PROJECT_FOLDER}/web/.env.example
```
# Public boards deploy URL
NEXT_PUBLIC_DEPLOY_URL="http://localhost/spaces"
```
## {PROJECT_FOLDER}/apiserver/.env
```
# Backend
# Debug value for api server use it as 0 for production use
DEBUG=0
CORS_ALLOWED_ORIGINS="http://localhost"
# Error logs
SENTRY_DSN=""
SENTRY_ENVIRONMENT="development"
# Database Settings
PGUSER="plane"
PGPASSWORD="plane"
PGHOST="plane-db"
PGDATABASE="plane"
DATABASE_URL=postgresql://${PGUSER}:${PGPASSWORD}@${PGHOST}/${PGDATABASE}
POSTGRES_USER="plane"
POSTGRES_PASSWORD="plane"
POSTGRES_HOST="plane-db"
POSTGRES_DB="plane"
POSTGRES_PORT=5432
DATABASE_URL=postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@${POSTGRES_HOST}:${POSTGRES_PORT}/${POSTGRES_DB}
# Redis Settings
REDIS_HOST="plane-redis"
REDIS_PORT="6379"
REDIS_URL="redis://${REDIS_HOST}:6379/"
# Email Settings
EMAIL_HOST=""
EMAIL_HOST_USER=""
EMAIL_HOST_PASSWORD=""
EMAIL_PORT=587
EMAIL_FROM="Team Plane <team@mailer.plane.so>"
EMAIL_USE_TLS="1"
EMAIL_USE_SSL="0"
# AWS Settings
AWS_REGION=""
AWS_ACCESS_KEY_ID="access-key"
@@ -95,35 +66,25 @@ AWS_S3_ENDPOINT_URL="http://plane-minio:9000"
AWS_S3_BUCKET_NAME="uploads"
# Maximum file upload limit
FILE_SIZE_LIMIT=5242880
# GPT settings
OPENAI_API_BASE="https://api.openai.com/v1" # deprecated
OPENAI_API_KEY="sk-" # deprecated
GPT_ENGINE="gpt-3.5-turbo" # deprecated
# Settings related to Docker
DOCKERIZED=1 # Deprecated
# Github
GITHUB_CLIENT_SECRET="" # For fetching release notes
DOCKERIZED=1 # deprecated
# set to 1 If using the pre-configured minio setup
USE_MINIO=1
# Nginx Configuration
NGINX_PORT=80
# SignUps
ENABLE_SIGNUP="1"
# Email Redirection URL
# Email redirections and minio domain settings
WEB_URL="http://localhost"
# Gunicorn Workers
GUNICORN_WORKERS=2
# Base URLs
ADMIN_BASE_URL=
SPACE_BASE_URL=
APP_BASE_URL=
SECRET_KEY="gxoytl7dmnc1y37zahah820z5iq3iozu38cnfjtu3yaau9cd9z"
```
## Updates
- The environment variable NEXT_PUBLIC_API_BASE_URL has been removed from both the web and space projects.
- The naming convention for containers and images has been updated.
- The plane-worker image will no longer be maintained, as it has been merged with plane-backend.
- The Tiptap pro-extension dependency has been removed, eliminating the need for Tiptap API keys.
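The env-var changes above replace the `PG*` names with `POSTGRES_*` and add an explicit `POSTGRES_PORT` to the `DATABASE_URL` interpolation. A small sketch of how the URL is composed from the new variables (helper name is illustrative):

```python
def build_database_url(env: dict) -> str:
    # Compose DATABASE_URL the same way the updated .env does,
    # from the renamed POSTGRES_* variables, with the port now explicit.
    return (
        f"postgresql://{env['POSTGRES_USER']}:{env['POSTGRES_PASSWORD']}"
        f"@{env['POSTGRES_HOST']}:{env['POSTGRES_PORT']}/{env['POSTGRES_DB']}"
    )
```

With the documented defaults this yields the same URL the old `PG*` form produced, plus the `:5432` port segment.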


@@ -10,6 +10,7 @@ import { IFormattedInstanceConfiguration, TInstanceGithubAuthenticationConfigura
import { Button, TOAST_TYPE, getButtonStyling, setToast } from "@plane/ui";
// components
import {
CodeBlock,
ConfirmDiscardModal,
ControllerInput,
CopyField,
@@ -102,7 +103,8 @@ export const InstanceGithubConfigForm: FC<Props> = (props) => {
url: originURL,
description: (
<>
We will auto-generate this. Paste this into the Authorized origin URL field{" "}
We will auto-generate this. Paste this into the{" "}
<CodeBlock darkerShade>Authorized origin URL</CodeBlock> field{" "}
<a
tabIndex={-1}
href="https://github.com/settings/applications/new"
@@ -121,7 +123,8 @@ export const InstanceGithubConfigForm: FC<Props> = (props) => {
url: `${originURL}/auth/github/callback/`,
description: (
<>
We will auto-generate this. Paste this into your Authorized Callback URI field{" "}
We will auto-generate this. Paste this into your{" "}
<CodeBlock darkerShade>Authorized Callback URI</CodeBlock> field{" "}
<a
tabIndex={-1}
href="https://github.com/settings/applications/new"
@@ -143,8 +146,8 @@ export const InstanceGithubConfigForm: FC<Props> = (props) => {
.then((response = []) => {
setToast({
type: TOAST_TYPE.SUCCESS,
title: "Success",
message: "Github Configuration Settings updated successfully",
title: "Done!",
message: "Your GitHub authentication is configured. You should test it now.",
});
reset({
GITHUB_CLIENT_ID: response.find((item) => item.key === "GITHUB_CLIENT_ID")?.value,
@@ -170,8 +173,8 @@ export const InstanceGithubConfigForm: FC<Props> = (props) => {
/>
<div className="flex flex-col gap-8">
<div className="grid grid-cols-2 gap-x-12 gap-y-8 w-full">
<div className="flex flex-col gap-y-4 col-span-2 md:col-span-1">
<div className="pt-2 text-xl font-medium">Configuration</div>
<div className="flex flex-col gap-y-4 col-span-2 md:col-span-1 pt-1">
<div className="pt-2.5 text-xl font-medium">GitHub-provided details for Plane</div>
{GITHUB_FORM_FIELDS.map((field) => (
<ControllerInput
key={field.key}
@@ -201,8 +204,8 @@ export const InstanceGithubConfigForm: FC<Props> = (props) => {
</div>
</div>
<div className="col-span-2 md:col-span-1">
<div className="flex flex-col gap-y-4 px-6 py-4 my-2 bg-custom-background-80/60 rounded-lg">
<div className="pt-2 text-xl font-medium">Service provider details</div>
<div className="flex flex-col gap-y-4 px-6 pt-1.5 pb-4 bg-custom-background-80/60 rounded-lg">
<div className="pt-2 text-xl font-medium">Plane-provided details for GitHub</div>
{GITHUB_SERVICE_FIELD.map((field) => (
<CopyField key={field.key} label={field.label} url={field.url} description={field.description} />
))}


@@ -93,7 +93,7 @@ const InstanceGithubAuthenticationPage = observer(() => {
withBorder={false}
/>
</div>
<div className="flex-grow overflow-hidden overflow-y-scroll vertical-scrollbar scrollbar-md p-4">
<div className="flex-grow overflow-hidden overflow-y-scroll vertical-scrollbar scrollbar-md px-4">
{formattedConfig ? (
<InstanceGithubConfigForm config={formattedConfig} />
) : (


@@ -8,6 +8,7 @@ import { IFormattedInstanceConfiguration, TInstanceGitlabAuthenticationConfigura
import { Button, TOAST_TYPE, getButtonStyling, setToast } from "@plane/ui";
// components
import {
CodeBlock,
ConfirmDiscardModal,
ControllerInput,
CopyField,
@@ -54,7 +55,7 @@ export const InstanceGitlabConfigForm: FC<Props> = (props) => {
label: "Host",
description: (
<>
This is the <b>GitLab host</b> to use for login, <b>including scheme</b>.
This is either https://gitlab.com or the <CodeBlock>domain.tld</CodeBlock> where you host GitLab.
</>
),
placeholder: "https://gitlab.com",
@@ -116,7 +117,8 @@ export const InstanceGitlabConfigForm: FC<Props> = (props) => {
url: `${originURL}/auth/gitlab/callback/`,
description: (
<>
We will auto-generate this. Paste this into the <b>Redirect URI</b> field of your{" "}
We will auto-generate this. Paste this into the{" "}
<CodeBlock darkerShade>Redirect URI</CodeBlock> field of your{" "}
<a
tabIndex={-1}
href="https://docs.gitlab.com/ee/integration/oauth_provider.html"
@@ -139,8 +141,8 @@ export const InstanceGitlabConfigForm: FC<Props> = (props) => {
.then((response = []) => {
setToast({
type: TOAST_TYPE.SUCCESS,
title: "Success",
message: "GitLab Configuration Settings updated successfully",
title: "Done!",
message: "Your GitLab authentication is configured. You should test it now.",
});
reset({
GITLAB_HOST: response.find((item) => item.key === "GITLAB_HOST")?.value,
@@ -167,8 +169,8 @@ export const InstanceGitlabConfigForm: FC<Props> = (props) => {
/>
<div className="flex flex-col gap-8">
<div className="grid grid-cols-2 gap-x-12 gap-y-8 w-full">
<div className="flex flex-col gap-y-4 col-span-2 md:col-span-1">
<div className="pt-2 text-xl font-medium">Configuration</div>
<div className="flex flex-col gap-y-4 col-span-2 md:col-span-1 pt-1">
<div className="pt-2.5 text-xl font-medium">GitLab-provided details for Plane</div>
{GITLAB_FORM_FIELDS.map((field) => (
<ControllerInput
key={field.key}
@@ -198,8 +200,8 @@ export const InstanceGitlabConfigForm: FC<Props> = (props) => {
</div>
</div>
<div className="col-span-2 md:col-span-1">
<div className="flex flex-col gap-y-4 px-6 py-4 my-2 bg-custom-background-80/60 rounded-lg">
<div className="pt-2 text-xl font-medium">Service provider details</div>
<div className="flex flex-col gap-y-4 px-6 pt-1.5 pb-4 bg-custom-background-80/60 rounded-lg">
<div className="pt-2 text-xl font-medium">Plane-provided details for GitLab</div>
{GITLAB_SERVICE_FIELD.map((field) => (
<CopyField key={field.key} label={field.label} url={field.url} description={field.description} />
))}


@@ -80,7 +80,7 @@ const InstanceGitlabAuthenticationPage = observer(() => {
withBorder={false}
/>
</div>
<div className="flex-grow overflow-hidden overflow-y-scroll vertical-scrollbar scrollbar-md p-4">
<div className="flex-grow overflow-hidden overflow-y-scroll vertical-scrollbar scrollbar-md px-4">
{formattedConfig ? (
<InstanceGitlabConfigForm config={formattedConfig} />
) : (


@@ -9,6 +9,7 @@ import { IFormattedInstanceConfiguration, TInstanceGoogleAuthenticationConfigura
import { Button, TOAST_TYPE, getButtonStyling, setToast } from "@plane/ui";
// components
import {
CodeBlock,
ConfirmDiscardModal,
ControllerInput,
CopyField,
@@ -101,7 +102,8 @@ export const InstanceGoogleConfigForm: FC<Props> = (props) => {
url: originURL,
description: (
<p>
We will auto-generate this. Paste this into your Authorized JavaScript origins field. For this OAuth client{" "}
We will auto-generate this. Paste this into your{" "}
<CodeBlock darkerShade>Authorized JavaScript origins</CodeBlock> field. For this OAuth client{" "}
<a
href="https://console.cloud.google.com/apis/credentials/oauthclient"
target="_blank"
@@ -119,7 +121,8 @@ export const InstanceGoogleConfigForm: FC<Props> = (props) => {
url: `${originURL}/auth/google/callback/`,
description: (
<p>
We will auto-generate this. Paste this into your Authorized Redirect URI field. For this OAuth client{" "}
We will auto-generate this. Paste this into your <CodeBlock darkerShade>Authorized Redirect URI</CodeBlock>{" "}
field. For this OAuth client{" "}
<a
href="https://console.cloud.google.com/apis/credentials/oauthclient"
target="_blank"
@@ -140,8 +143,8 @@ export const InstanceGoogleConfigForm: FC<Props> = (props) => {
.then((response = []) => {
setToast({
type: TOAST_TYPE.SUCCESS,
title: "Success",
message: "Google Configuration Settings updated successfully",
title: "Done!",
message: "Your Google authentication is configured. You should test it now.",
});
reset({
GOOGLE_CLIENT_ID: response.find((item) => item.key === "GOOGLE_CLIENT_ID")?.value,
@@ -167,8 +170,8 @@ export const InstanceGoogleConfigForm: FC<Props> = (props) => {
/>
<div className="flex flex-col gap-8">
<div className="grid grid-cols-2 gap-x-12 gap-y-8 w-full">
<div className="flex flex-col gap-y-4 col-span-2 md:col-span-1">
<div className="pt-2 text-xl font-medium">Configuration</div>
<div className="flex flex-col gap-y-4 col-span-2 md:col-span-1 pt-1">
<div className="pt-2.5 text-xl font-medium">Google-provided details for Plane</div>
{GOOGLE_FORM_FIELDS.map((field) => (
<ControllerInput
key={field.key}
@@ -198,8 +201,8 @@ export const InstanceGoogleConfigForm: FC<Props> = (props) => {
</div>
</div>
<div className="col-span-2 md:col-span-1">
<div className="flex flex-col gap-y-4 px-6 py-4 my-2 bg-custom-background-80/60 rounded-lg">
<div className="pt-2 text-xl font-medium">Service provider details</div>
<div className="flex flex-col gap-y-4 px-6 pt-1.5 pb-4 bg-custom-background-80/60 rounded-lg">
<div className="pt-2 text-xl font-medium">Plane-provided details for Google</div>
{GOOGLE_SERVICE_DETAILS.map((field) => (
<CopyField key={field.key} label={field.label} url={field.url} description={field.description} />
))}


@@ -81,7 +81,7 @@ const InstanceGoogleAuthenticationPage = observer(() => {
withBorder={false}
/>
</div>
<div className="flex-grow overflow-hidden overflow-y-scroll vertical-scrollbar scrollbar-md p-4">
<div className="flex-grow overflow-hidden overflow-y-scroll vertical-scrollbar scrollbar-md px-4">
{formattedConfig ? (
<InstanceGoogleConfigForm config={formattedConfig} />
) : (


@@ -1,84 +1,48 @@
import { observer } from "mobx-react";
import Image from "next/image";
import { useTheme } from "next-themes";
import { KeyRound, Mails } from "lucide-react";
// types
import { TInstanceAuthenticationMethodKeys, TInstanceAuthenticationModes } from "@plane/types";
// components
import {
AuthenticationMethodCard,
EmailCodesConfiguration,
GithubConfiguration,
GitlabConfiguration,
GoogleConfiguration,
PasswordLoginConfiguration,
} from "@/components/authentication";
TGetBaseAuthenticationModeProps,
TInstanceAuthenticationMethodKeys,
TInstanceAuthenticationModes,
} from "@plane/types";
// components
import { AuthenticationMethodCard } from "@/components/authentication";
// helpers
import { resolveGeneralTheme } from "@/helpers/common.helper";
import { UpgradeButton } from "@/components/common/upgrade-button";
import { getBaseAuthenticationModes } from "@/helpers/authentication.helper";
// images
import githubLightModeImage from "@/public/logos/github-black.png";
import githubDarkModeImage from "@/public/logos/github-white.png";
import GitlabLogo from "@/public/logos/gitlab-logo.svg";
import GoogleLogo from "@/public/logos/google-logo.svg";
import OIDCLogo from "@/public/logos/oidc-logo.svg";
import SAMLLogo from "@/public/logos/saml-logo.svg";
export type TAuthenticationModeProps = {
disabled: boolean;
updateConfig: (key: TInstanceAuthenticationMethodKeys, value: string) => void;
};
export type TGetAuthenticationModeProps = {
disabled: boolean;
updateConfig: (key: TInstanceAuthenticationMethodKeys, value: string) => void;
resolvedTheme: string | undefined;
};
// Authentication methods
export const getAuthenticationModes: (props: TGetAuthenticationModeProps) => TInstanceAuthenticationModes[] = ({
export const getAuthenticationModes: (props: TGetBaseAuthenticationModeProps) => TInstanceAuthenticationModes[] = ({
disabled,
updateConfig,
resolvedTheme,
}) => [
...getBaseAuthenticationModes({ disabled, updateConfig, resolvedTheme }),
{
key: "unique-codes",
name: "Unique codes",
description: "Log in or sign up for Plane using codes sent via email. You need to have set up SMTP to use this method.",
icon: <Mails className="h-6 w-6 p-0.5 text-custom-text-300/80" />,
config: <EmailCodesConfiguration disabled={disabled} updateConfig={updateConfig} />,
key: "oidc",
name: "OIDC",
description: "Authenticate your users via the OpenID Connect protocol.",
icon: <Image src={OIDCLogo} height={22} width={22} alt="OIDC Logo" />,
config: <UpgradeButton />,
unavailable: true,
},
{
key: "passwords-login",
name: "Passwords",
description: "Allow members to create accounts with passwords and use it with their email addresses to sign in.",
icon: <KeyRound className="h-6 w-6 p-0.5 text-custom-text-300/80" />,
config: <PasswordLoginConfiguration disabled={disabled} updateConfig={updateConfig} />,
},
{
key: "google",
name: "Google",
description: "Allow members to log in or sign up for Plane with their Google accounts.",
icon: <Image src={GoogleLogo} height={20} width={20} alt="Google Logo" />,
config: <GoogleConfiguration disabled={disabled} updateConfig={updateConfig} />,
},
{
key: "github",
name: "GitHub",
description: "Allow members to log in or sign up for Plane with their GitHub accounts.",
icon: (
<Image
src={resolveGeneralTheme(resolvedTheme) === "dark" ? githubDarkModeImage : githubLightModeImage}
height={20}
width={20}
alt="GitHub Logo"
/>
),
config: <GithubConfiguration disabled={disabled} updateConfig={updateConfig} />,
},
{
key: "gitlab",
name: "GitLab",
description: "Allow members to login or sign up to plane with their GitLab accounts.",
icon: <Image src={GitlabLogo} height={20} width={20} alt="GitLab Logo" />,
config: <GitlabConfiguration disabled={disabled} updateConfig={updateConfig} />,
key: "saml",
name: "SAML",
description: "Authenticate your users via the Security Assertion Markup Language protocol.",
icon: <Image src={SAMLLogo} height={22} width={22} alt="SAML Logo" className="pl-0.5" />,
config: <UpgradeButton />,
unavailable: true,
},
];
@@ -97,6 +61,7 @@ export const AuthenticationModes: React.FC<TAuthenticationModeProps> = observer(
icon={method.icon}
config={method.config}
disabled={disabled}
unavailable={method.unavailable}
/>
))}
</>


@@ -41,10 +41,10 @@ export const InstanceSidebar: FC<IInstanceSidebar> = observer(() => {
<div
className={`inset-y-0 z-20 flex h-full flex-shrink-0 flex-grow-0 flex-col border-r border-custom-sidebar-border-200 bg-custom-sidebar-background-100 duration-300
fixed md:relative
${isSidebarCollapsed ? "-ml-[280px]" : ""}
sm:${isSidebarCollapsed ? "-ml-[280px]" : ""}
md:ml-0 ${isSidebarCollapsed ? "w-[70px]" : "w-[280px]"}
lg:ml-0 ${isSidebarCollapsed ? "w-[70px]" : "w-[280px]"}
${isSidebarCollapsed ? "-ml-[290px]" : ""}
sm:${isSidebarCollapsed ? "-ml-[290px]" : ""}
md:ml-0 ${isSidebarCollapsed ? "w-[70px]" : "w-[290px]"}
lg:ml-0 ${isSidebarCollapsed ? "w-[70px]" : "w-[290px]"}
`}
>
<div ref={ref} className="flex h-full w-full flex-1 flex-col">


@@ -11,10 +11,11 @@ type Props = {
config: JSX.Element;
disabled?: boolean;
withBorder?: boolean;
unavailable?: boolean;
};
export const AuthenticationMethodCard: FC<Props> = (props) => {
const { name, description, icon, config, disabled = false, withBorder = true } = props;
const { name, description, icon, config, disabled = false, withBorder = true, unavailable = false } = props;
return (
<div
@@ -22,7 +23,11 @@ export const AuthenticationMethodCard: FC<Props> = (props) => {
"px-4 py-3 border border-custom-border-200": withBorder,
})}
>
<div className="flex grow items-center gap-4">
<div
className={cn("flex grow items-center gap-4", {
"opacity-50": unavailable,
})}
>
<div className="shrink-0">
<div className="flex h-10 w-10 items-center justify-center rounded-full bg-custom-background-80">{icon}</div>
</div>


@@ -0,0 +1,21 @@
import { cn } from "@/helpers/common.helper";
type TProps = {
children: React.ReactNode;
className?: string;
darkerShade?: boolean;
};
export const CodeBlock = ({ children, className, darkerShade }: TProps) => (
<span
className={cn(
"px-0.5 text-xs text-custom-text-300 bg-custom-background-90 font-semibold rounded-md border border-custom-border-100",
{
"text-custom-text-200 bg-custom-background-80 border-custom-border-200": darkerShade,
},
className
)}
>
{children}
</span>
);


@@ -38,7 +38,7 @@ export const ControllerInput: React.FC<Props> = (props) => {
return (
<div className="flex flex-col gap-1">
<h4 className="text-sm text-custom-text-300">
{label} {!required && "(optional)"}
{label}
</h4>
<div className="relative">
<Controller
@@ -80,7 +80,7 @@ export const ControllerInput: React.FC<Props> = (props) => {
</button>
))}
</div>
{description && <p className="text-xs text-custom-text-300">{description}</p>}
{description && <p className="pt-0.5 text-xs text-custom-text-300">{description}</p>}
</div>
);
};


@@ -7,3 +7,5 @@ export * from "./banner";
export * from "./empty-state";
export * from "./logo-spinner";
export * from "./page-header";
export * from "./code-block";
export * from "./upgrade-button";


@@ -0,0 +1,16 @@
"use client";
import React from "react";
// icons
import { SquareArrowOutUpRight } from "lucide-react";
// ui
import { getButtonStyling } from "@plane/ui";
// helpers
import { cn } from "@/helpers/common.helper";
export const UpgradeButton: React.FC = () => (
<a href="https://plane.so/one" target="_blank" className={cn(getButtonStyling("primary", "sm"))}>
Available on One
<SquareArrowOutUpRight className="h-3.5 w-3.5 p-0.5" />
</a>
);


@@ -1,7 +1,24 @@
import { ReactNode } from "react";
import Image from "next/image";
import Link from "next/link";
import { KeyRound, Mails } from "lucide-react";
// types
import { TGetBaseAuthenticationModeProps, TInstanceAuthenticationModes } from "@plane/types";
// components
import {
EmailCodesConfiguration,
GithubConfiguration,
GitlabConfiguration,
GoogleConfiguration,
PasswordLoginConfiguration,
} from "@/components/authentication";
// helpers
import { SUPPORT_EMAIL } from "./common.helper";
import { SUPPORT_EMAIL, resolveGeneralTheme } from "@/helpers/common.helper";
// images
import githubLightModeImage from "@/public/logos/github-black.png";
import githubDarkModeImage from "@/public/logos/github-white.png";
import GitlabLogo from "@/public/logos/gitlab-logo.svg";
import GoogleLogo from "@/public/logos/google-logo.svg";
export enum EPageTypes {
PUBLIC = "PUBLIC",
@@ -134,3 +151,53 @@ export const authErrorHandler = (
return undefined;
};
export const getBaseAuthenticationModes: (props: TGetBaseAuthenticationModeProps) => TInstanceAuthenticationModes[] = ({
disabled,
updateConfig,
resolvedTheme,
}) => [
{
key: "unique-codes",
name: "Unique codes",
description:
"Log in or sign up for Plane using codes sent via email. You need to have set up SMTP to use this method.",
icon: <Mails className="h-6 w-6 p-0.5 text-custom-text-300/80" />,
config: <EmailCodesConfiguration disabled={disabled} updateConfig={updateConfig} />,
},
{
key: "passwords-login",
name: "Passwords",
description: "Allow members to create accounts with passwords and use it with their email addresses to sign in.",
icon: <KeyRound className="h-6 w-6 p-0.5 text-custom-text-300/80" />,
config: <PasswordLoginConfiguration disabled={disabled} updateConfig={updateConfig} />,
},
{
key: "google",
name: "Google",
description: "Allow members to log in or sign up for Plane with their Google accounts.",
icon: <Image src={GoogleLogo} height={20} width={20} alt="Google Logo" />,
config: <GoogleConfiguration disabled={disabled} updateConfig={updateConfig} />,
},
{
key: "github",
name: "GitHub",
description: "Allow members to log in or sign up for Plane with their GitHub accounts.",
icon: (
<Image
src={resolveGeneralTheme(resolvedTheme) === "dark" ? githubDarkModeImage : githubLightModeImage}
height={20}
width={20}
alt="GitHub Logo"
/>
),
config: <GithubConfiguration disabled={disabled} updateConfig={updateConfig} />,
},
{
key: "gitlab",
name: "GitLab",
description: "Allow members to log in or sign up to plane with their GitLab accounts.",
icon: <Image src={GitlabLogo} height={20} width={20} alt="GitLab Logo" />,
config: <GitlabConfiguration disabled={disabled} updateConfig={updateConfig} />,
},
];

View File

@@ -1,6 +1,6 @@
{
"name": "admin",
"version": "0.21.0",
"version": "0.22.0",
"private": true,
"scripts": {
"dev": "turbo run develop",
@@ -14,6 +14,7 @@
"@headlessui/react": "^1.7.19",
"@plane/types": "*",
"@plane/ui": "*",
"@plane/constants": "*",
"@tailwindcss/typography": "^0.5.9",
"@types/lodash": "^4.17.0",
"autoprefixer": "10.4.14",
@@ -46,4 +47,4 @@
"tsconfig": "*",
"typescript": "^5.4.2"
}
}
}


@@ -0,0 +1,11 @@
<svg width="92" height="84" viewBox="0 0 92 84" fill="none" xmlns="http://www.w3.org/2000/svg">
<g clip-path="url(#clip0_3695_11896)">
<path d="M83.0398 32.6876C74.2901 27.2397 62.0735 23.8553 48.7013 23.8553C21.7918 23.8553 0 37.3101 0 53.9016C0 69.0898 18.1598 81.554 41.685 83.7001V74.9504C25.8364 72.9693 13.95 64.3022 13.95 53.9016C13.95 42.0977 29.4684 32.44 48.7013 32.44C58.2765 32.44 66.9436 34.8338 73.217 38.7134L64.3022 44.2439H92.1197V27.0746L83.0398 32.6876Z" fill="#CCCCCC"/>
<path d="M41.6846 8.99736V74.9504V83.7002L55.6346 74.9504V0L41.6846 8.99736Z" fill="#FF6200"/>
</g>
<defs>
<clipPath id="clip0_3695_11896">
<rect width="92" height="84" fill="white"/>
</clipPath>
</defs>
</svg>



@@ -0,0 +1,17 @@
<svg width="700" height="650" viewBox="0 0 700 650" fill="none" xmlns="http://www.w3.org/2000/svg">
<g clip-path="url(#clip0_3262_5767)">
<mask id="mask0_3262_5767" style="mask-type:luminance" maskUnits="userSpaceOnUse" x="0" y="0" width="700" height="650">
<path d="M700 0H0V650H700V0Z" fill="white"/>
</mask>
<g mask="url(#mask0_3262_5767)">
<path d="M337.682 0L360.832 20.5C377.982 35.7 395.182 50.85 412.132 66.25C521.982 166 614.982 278.25 684.982 407.45C688.582 414.05 691.832 420.85 695.082 427.6L699.982 437.75L694.582 440.6L690.532 434.85L680.032 419.9L672.682 409.2C621.732 335.25 570.682 261.2 500.582 201.95C479.373 183.995 455.969 168.807 430.932 156.75C380.232 132.5 335.132 142.2 296.432 182C259.632 219.85 240.532 266.85 223.282 314.65C221.032 320.8 218.682 326.9 216.332 333L212.232 343.75L203.632 341C208.632 323.6 213.232 306.1 217.832 288.55C228.332 248.8 238.832 209.05 253.432 170.75C268.932 129.95 288.532 90.6 308.082 51.25C316.532 34.2 324.982 17.15 333.082 0H337.682Z" fill="#C22E33"/>
<path d="M372.382 491.1C291.082 529.6 94.3829 569.3 1.08287 559.1C-14.1671 478.8 135.482 102.5 208.982 45.5L204.232 56.4C202.115 61.531 199.813 66.5842 197.332 71.55L194.032 78C156.032 151.1 118.082 224.3 98.6329 304.5C91.6287 332.124 87.8038 360.458 87.2328 388.95C86.7328 455.95 128.432 501.55 198.082 504.4C231.582 505.75 265.432 502.25 299.232 498.7C313.932 497.2 328.582 495.65 343.232 494.5C348.632 494.1 353.932 493.45 360.832 492.55L372.382 491.15V491.1Z" fill="#C22E33"/>
<path d="M141.233 639.05C118.983 640.75 96.733 642.45 74.583 644.45C279.433 663.95 476.083 630.6 670.083 562.25C606.833 450.75 521.583 362.7 422.483 286.15C423.783 291.05 426.683 294.6 429.533 298.1L431.933 301.1C440.433 312.4 449.333 323.5 458.283 334.6C478.733 360.05 499.183 385.5 514.583 413.5C553.483 484.5 532.383 545.9 456.183 578.3C406.083 599.65 351.333 614.2 297.183 622.9C245.683 631.1 193.433 635.05 141.233 639.05Z" fill="#C22E33"/>
</g>
</g>
<defs>
<clipPath id="clip0_3262_5767">
<rect width="700" height="650" fill="white"/>
</clipPath>
</defs>
</svg>



@@ -1,4 +1,4 @@
{
"name": "plane-api",
"version": "0.21.0"
"version": "0.22.0"
}


@@ -13,9 +13,9 @@ class CycleSerializer(BaseSerializer):
started_issues = serializers.IntegerField(read_only=True)
unstarted_issues = serializers.IntegerField(read_only=True)
backlog_issues = serializers.IntegerField(read_only=True)
total_estimates = serializers.IntegerField(read_only=True)
completed_estimates = serializers.IntegerField(read_only=True)
started_estimates = serializers.IntegerField(read_only=True)
total_estimates = serializers.FloatField(read_only=True)
completed_estimates = serializers.FloatField(read_only=True)
started_estimates = serializers.FloatField(read_only=True)
def validate(self, data):
if (
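The serializer change above widens the estimate aggregates from `IntegerField` to `FloatField`. The motivation is easy to see in plain Python: estimate point values are stored as strings, and an integer cast would truncate fractional points. A minimal illustration (function name hypothetical, not from the diff):

```python
def sum_estimates(values):
    # Estimate point values arrive as strings; casting to float
    # (rather than int, as before this change) preserves fractional
    # estimates such as "0.5" in the aggregated totals.
    return sum(float(v) for v in values)
```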


@@ -131,7 +131,10 @@ class IssueSerializer(BaseSerializer):
workspace_id = self.context["workspace_id"]
default_assignee_id = self.context["default_assignee_id"]
issue = Issue.objects.create(**validated_data, project_id=project_id)
issue = Issue.objects.create(
**validated_data,
project_id=project_id,
)
# Issue Audit Users
created_by_id = issue.created_by_id
@@ -312,10 +315,14 @@ class IssueLinkSerializer(BaseSerializer):
return IssueLink.objects.create(**validated_data)
def update(self, instance, validated_data):
if IssueLink.objects.filter(
url=validated_data.get("url"),
issue_id=instance.issue_id,
).exclude(pk=instance.id).exists():
if (
IssueLink.objects.filter(
url=validated_data.get("url"),
issue_id=instance.issue_id,
)
.exclude(pk=instance.id)
.exists()
):
raise serializers.ValidationError(
{"error": "URL already exists for this Issue"}
)
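The hunk above restructures the duplicate-URL guard in `IssueLinkSerializer.update` for readability; the behaviour is unchanged: an update is rejected when another link on the same issue already uses the URL, while the link being edited is excluded from the check. A minimal pure-Python sketch of that rule (the `links` structure and `has_duplicate_url` helper are illustrative stand-ins, not Plane's API):

```python
def has_duplicate_url(links, issue_id, url, current_link_id):
    """Mirror of the serializer guard: True when another link on the
    same issue (excluding the one being edited) already uses the URL."""
    return any(
        link["issue_id"] == issue_id
        and link["url"] == url
        and link["id"] != current_link_id
        for link in links
    )


links = [
    {"id": 1, "issue_id": 10, "url": "https://example.com/spec"},
    {"id": 2, "issue_id": 10, "url": "https://example.com/design"},
]

# Editing link 2 to reuse link 1's URL on the same issue is a conflict.
print(has_duplicate_url(links, 10, "https://example.com/spec", 2))    # True
# Keeping its own URL is fine, because pk=instance.id is excluded.
print(has_duplicate_url(links, 10, "https://example.com/design", 2))  # False
```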


@@ -12,7 +12,7 @@ from django.db.models import (
OuterRef,
Q,
Sum,
IntegerField,
FloatField,
)
from django.db.models.functions import Cast
@@ -867,12 +867,12 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
.values("display_name", "assignee_id", "avatar")
.annotate(
total_estimates=Sum(
Cast("estimate_point__value", IntegerField())
Cast("estimate_point__value", FloatField())
)
)
.annotate(
completed_estimates=Sum(
Cast("estimate_point__value", IntegerField()),
Cast("estimate_point__value", FloatField()),
filter=Q(
completed_at__isnull=False,
archived_at__isnull=True,
@@ -882,7 +882,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
)
.annotate(
pending_estimates=Sum(
Cast("estimate_point__value", IntegerField()),
Cast("estimate_point__value", FloatField()),
filter=Q(
completed_at__isnull=True,
archived_at__isnull=True,
@@ -921,12 +921,12 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
.values("label_name", "color", "label_id")
.annotate(
total_estimates=Sum(
Cast("estimate_point__value", IntegerField())
Cast("estimate_point__value", FloatField())
)
)
.annotate(
completed_estimates=Sum(
Cast("estimate_point__value", IntegerField()),
Cast("estimate_point__value", FloatField()),
filter=Q(
completed_at__isnull=False,
archived_at__isnull=True,
@@ -936,7 +936,7 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
)
.annotate(
pending_estimates=Sum(
Cast("estimate_point__value", IntegerField()),
Cast("estimate_point__value", FloatField()),
filter=Q(
completed_at__isnull=True,
archived_at__isnull=True,
@@ -1140,12 +1140,38 @@ class TransferCycleIssueAPIEndpoint(BaseAPIView):
)
updated_cycles = []
update_cycle_issue_activity = []
for cycle_issue in cycle_issues:
cycle_issue.cycle_id = new_cycle_id
updated_cycles.append(cycle_issue)
update_cycle_issue_activity.append(
{
"old_cycle_id": str(cycle_id),
"new_cycle_id": str(new_cycle_id),
"issue_id": str(cycle_issue.issue_id),
}
)
cycle_issues = CycleIssue.objects.bulk_update(
updated_cycles, ["cycle_id"], batch_size=100
)
# Capture Issue Activity
issue_activity.delay(
type="cycle.activity.created",
requested_data=json.dumps({"cycles_list": []}),
actor_id=str(self.request.user.id),
issue_id=None,
project_id=str(self.kwargs.get("project_id", None)),
current_instance=json.dumps(
{
"updated_cycle_issues": update_cycle_issue_activity,
"created_cycle_issues": "[]",
}
),
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
)
return Response({"message": "Success"}, status=status.HTTP_200_OK)
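The repeated `IntegerField()` → `FloatField()` swaps in the hunks above matter because `estimate_point__value` is stored as text and may be fractional (e.g. `"0.5"` points); casting to an integer either fails or truncates the sum. A plain-Python illustration of why the cast target has to be a float (the value list is made up for the example):

```python
# estimate_point__value is text in the database, and may hold fractions.
estimate_values = ["1", "2.5", "0.5"]

# A float cast sums fractional story points correctly.
total = sum(float(v) for v in estimate_values)
print(total)  # 4.0

# An integer cast rejects fractional values outright.
try:
    sum(int(v) for v in estimate_values)
except ValueError as exc:
    print(exc)  # invalid literal for int() with base 10: '2.5'
```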


@@ -149,7 +149,7 @@ class InboxIssueAPIEndpoint(BaseAPIView):
description_html=request.data.get("issue", {}).get(
"description_html", "<p></p>"
),
priority=request.data.get("issue", {}).get("priority", "low"),
priority=request.data.get("issue", {}).get("priority", "none"),
project_id=project_id,
state=state,
)


@@ -19,7 +19,7 @@ from plane.app.permissions import ProjectBasePermission
from plane.db.models import (
Cycle,
Inbox,
IssueProperty,
IssueUserProperty,
Module,
Project,
DeployBoard,
@@ -165,7 +165,7 @@ class ProjectAPIEndpoint(BaseAPIView):
role=20,
)
# Also create the issue property for the user
_ = IssueProperty.objects.create(
_ = IssueUserProperty.objects.create(
project_id=serializer.data["id"],
user=request.user,
)
@@ -179,7 +179,7 @@ class ProjectAPIEndpoint(BaseAPIView):
role=20,
)
# Also create the issue property for the user
IssueProperty.objects.create(
IssueUserProperty.objects.create(
project_id=serializer.data["id"],
user_id=serializer.data["project_lead"],
)
@@ -240,6 +240,7 @@ class ProjectAPIEndpoint(BaseAPIView):
.filter(pk=serializer.data["id"])
.first()
)
# Model activity
model_activity.delay(
model_name="project",


@@ -50,7 +50,7 @@ from .issue import (
IssueCreateSerializer,
IssueActivitySerializer,
IssueCommentSerializer,
IssuePropertySerializer,
IssueUserPropertySerializer,
IssueAssigneeSerializer,
LabelSerializer,
IssueSerializer,
@@ -91,6 +91,7 @@ from .page import (
PageLogSerializer,
SubPageSerializer,
PageDetailSerializer,
PageVersionSerializer,
)
from .estimate import (


@@ -17,7 +17,7 @@ from plane.db.models import (
Issue,
IssueActivity,
IssueComment,
IssueProperty,
IssueUserProperty,
IssueAssignee,
IssueSubscriber,
IssueLabel,
@@ -135,7 +135,11 @@ class IssueCreateSerializer(BaseSerializer):
workspace_id = self.context["workspace_id"]
default_assignee_id = self.context["default_assignee_id"]
issue = Issue.objects.create(**validated_data, project_id=project_id)
# Create Issue
issue = Issue.objects.create(
**validated_data,
project_id=project_id,
)
# Issue Audit Users
created_by_id = issue.created_by_id
@@ -248,9 +252,9 @@ class IssueActivitySerializer(BaseSerializer):
fields = "__all__"
class IssuePropertySerializer(BaseSerializer):
class IssueUserPropertySerializer(BaseSerializer):
class Meta:
model = IssueProperty
model = IssueUserProperty
fields = "__all__"
read_only_fields = [
"user",
@@ -459,10 +463,14 @@ class IssueLinkSerializer(BaseSerializer):
return IssueLink.objects.create(**validated_data)
def update(self, instance, validated_data):
if IssueLink.objects.filter(
url=validated_data.get("url"),
issue_id=instance.issue_id,
).exclude(pk=instance.id).exists():
if (
IssueLink.objects.filter(
url=validated_data.get("url"),
issue_id=instance.issue_id,
)
.exclude(pk=instance.id)
.exists()
):
raise serializers.ValidationError(
{"error": "URL already exists for this Issue"}
)
@@ -509,7 +517,7 @@ class IssueAttachmentLiteSerializer(DynamicBaseSerializer):
"attributes",
"issue_id",
"updated_at",
"updated_by_id",
"updated_by",
]
read_only_fields = fields


@@ -177,8 +177,8 @@ class ModuleSerializer(DynamicBaseSerializer):
started_issues = serializers.IntegerField(read_only=True)
unstarted_issues = serializers.IntegerField(read_only=True)
backlog_issues = serializers.IntegerField(read_only=True)
total_estimate_points = serializers.IntegerField(read_only=True)
completed_estimate_points = serializers.IntegerField(read_only=True)
total_estimate_points = serializers.FloatField(read_only=True)
completed_estimate_points = serializers.FloatField(read_only=True)
class Meta:
model = Module
@@ -222,10 +222,10 @@ class ModuleSerializer(DynamicBaseSerializer):
class ModuleDetailSerializer(ModuleSerializer):
link_module = ModuleLinkSerializer(read_only=True, many=True)
sub_issues = serializers.IntegerField(read_only=True)
backlog_estimate_points = serializers.IntegerField(read_only=True)
unstarted_estimate_points = serializers.IntegerField(read_only=True)
started_estimate_points = serializers.IntegerField(read_only=True)
cancelled_estimate_points = serializers.IntegerField(read_only=True)
backlog_estimate_points = serializers.FloatField(read_only=True)
unstarted_estimate_points = serializers.FloatField(read_only=True)
started_estimate_points = serializers.FloatField(read_only=True)
cancelled_estimate_points = serializers.FloatField(read_only=True)
class Meta(ModuleSerializer.Meta):
fields = ModuleSerializer.Meta.fields + ["link_module", "sub_issues", "backlog_estimate_points", "unstarted_estimate_points", "started_estimate_points", "cancelled_estimate_points"]


@@ -3,11 +3,16 @@ from .base import BaseSerializer
from .user import UserLiteSerializer
from plane.db.models import Notification, UserNotificationPreference
# Third Party imports
from rest_framework import serializers
class NotificationSerializer(BaseSerializer):
triggered_by_details = UserLiteSerializer(
read_only=True, source="triggered_by"
)
is_inbox_issue = serializers.BooleanField(read_only=True)
is_mentioned_notification = serializers.BooleanField(read_only=True)
class Meta:
model = Notification


@@ -10,6 +10,7 @@ from plane.db.models import (
Label,
ProjectPage,
Project,
PageVersion,
)
@@ -161,3 +162,13 @@ class PageLogSerializer(BaseSerializer):
"workspace",
"page",
]
class PageVersionSerializer(BaseSerializer):
class Meta:
model = PageVersion
fields = "__all__"
read_only_fields = [
"workspace",
"page",
]


@@ -23,7 +23,7 @@ class IssueViewSerializer(DynamicBaseSerializer):
]
def create(self, validated_data):
query_params = validated_data.get("query_data", {})
query_params = validated_data.get("filters", {})
if bool(query_params):
validated_data["query"] = issue_filters(query_params, "POST")
else:
@@ -31,7 +31,7 @@ class IssueViewSerializer(DynamicBaseSerializer):
return IssueView.objects.create(**validated_data)
def update(self, instance, validated_data):
query_params = validated_data.get("query_data", {})
query_params = validated_data.get("filters", {})
if bool(query_params):
validated_data["query"] = issue_filters(query_params, "POST")
else:


@@ -15,6 +15,7 @@ from plane.db.models import (
WorkspaceTheme,
WorkspaceUserProperties,
)
from plane.utils.constants import RESTRICTED_WORKSPACE_SLUGS
class WorkSpaceSerializer(DynamicBaseSerializer):
@@ -22,22 +23,11 @@ class WorkSpaceSerializer(DynamicBaseSerializer):
total_members = serializers.IntegerField(read_only=True)
total_issues = serializers.IntegerField(read_only=True)
def validated(self, data):
if data.get("slug") in [
"404",
"accounts",
"api",
"create-workspace",
"god-mode",
"installations",
"invitations",
"onboarding",
"profile",
"spaces",
"workspace-invitations",
"password",
]:
raise serializers.ValidationError({"slug": "Slug is not valid"})
def validate_slug(self, value):
# Check if the slug is restricted
if value in RESTRICTED_WORKSPACE_SLUGS:
raise serializers.ValidationError("Slug is not valid")
return value
class Meta:
model = Workspace
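The hunk above replaces a misnamed `validated` method (which DRF never invokes, so the old check was dead code) with a proper `validate_slug` field validator backed by a shared `RESTRICTED_WORKSPACE_SLUGS` constant. The pattern, sketched without DRF; the set contents here are copied from the removed inline list, and the actual constant in `plane.utils.constants` may differ:

```python
RESTRICTED_WORKSPACE_SLUGS = frozenset({
    "404", "accounts", "api", "create-workspace", "god-mode",
    "installations", "invitations", "onboarding", "profile",
    "spaces", "workspace-invitations", "password",
})


def validate_slug(value: str) -> str:
    """Reject slugs that collide with reserved application routes."""
    if value in RESTRICTED_WORKSPACE_SLUGS:
        raise ValueError("Slug is not valid")
    return value


print(validate_slug("my-team"))  # my-team
try:
    validate_slug("api")
except ValueError as exc:
    print(exc)  # Slug is not valid
```

Moving the list into a constant lets every serializer that validates slugs share one source of truth instead of drifting copies.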


@@ -233,13 +233,13 @@ urlpatterns = [
name="project-issue-comment-reactions",
),
## End Comment Reactions
## IssueProperty
## IssueUserProperty
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/user-properties/",
IssueUserDisplayPropertyEndpoint.as_view(),
name="project-issue-display-properties",
),
## IssueProperty End
## IssueUserProperty End
## Issue Archives
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/archived-issues/",


@@ -7,6 +7,7 @@ from plane.app.views import (
PageLogEndpoint,
SubPagesEndpoint,
PagesDescriptionViewSet,
PageVersionEndpoint,
)
@@ -65,6 +66,16 @@ urlpatterns = [
),
name="project-pages-lock-unlock",
),
# private and public page
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:pk>/access/",
PageViewSet.as_view(
{
"post": "access",
}
),
name="project-pages-access",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:pk>/transactions/",
PageLogEndpoint.as_view(),
@@ -90,4 +101,14 @@ urlpatterns = [
),
name="page-description",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/versions/",
PageVersionEndpoint.as_view(),
name="page-versions",
),
path(
"workspaces/<str:slug>/projects/<uuid:project_id>/pages/<uuid:page_id>/versions/<uuid:pk>/",
PageVersionEndpoint.as_view(),
name="page-versions",
),
]


@@ -179,6 +179,7 @@ from .page.base import (
SubPagesEndpoint,
PagesDescriptionViewSet,
)
from .page.version import PageVersionEndpoint
from .search.base import GlobalSearchEndpoint
from .search.issue import IssueSearchEndpoint


@@ -19,7 +19,7 @@ from django.db.models import (
When,
Subquery,
Sum,
IntegerField,
FloatField,
)
from django.db.models.functions import Coalesce, Cast
from django.utils import timezone
@@ -86,7 +86,7 @@ class CycleViewSet(BaseViewSet):
.values("issue_cycle__cycle_id")
.annotate(
backlog_estimate_point=Sum(
Cast("estimate_point__value", IntegerField())
Cast("estimate_point__value", FloatField())
)
)
.values("backlog_estimate_point")[:1]
@@ -100,7 +100,7 @@ class CycleViewSet(BaseViewSet):
.values("issue_cycle__cycle_id")
.annotate(
unstarted_estimate_point=Sum(
Cast("estimate_point__value", IntegerField())
Cast("estimate_point__value", FloatField())
)
)
.values("unstarted_estimate_point")[:1]
@@ -114,7 +114,7 @@ class CycleViewSet(BaseViewSet):
.values("issue_cycle__cycle_id")
.annotate(
started_estimate_point=Sum(
Cast("estimate_point__value", IntegerField())
Cast("estimate_point__value", FloatField())
)
)
.values("started_estimate_point")[:1]
@@ -128,7 +128,7 @@ class CycleViewSet(BaseViewSet):
.values("issue_cycle__cycle_id")
.annotate(
cancelled_estimate_point=Sum(
Cast("estimate_point__value", IntegerField())
Cast("estimate_point__value", FloatField())
)
)
.values("cancelled_estimate_point")[:1]
@@ -142,7 +142,7 @@ class CycleViewSet(BaseViewSet):
.values("issue_cycle__cycle_id")
.annotate(
completed_estimate_points=Sum(
Cast("estimate_point__value", IntegerField())
Cast("estimate_point__value", FloatField())
)
)
.values("completed_estimate_points")[:1]
@@ -155,7 +155,7 @@ class CycleViewSet(BaseViewSet):
.values("issue_cycle__cycle_id")
.annotate(
total_estimate_points=Sum(
Cast("estimate_point__value", IntegerField())
Cast("estimate_point__value", FloatField())
)
)
.values("total_estimate_points")[:1]
@@ -287,37 +287,37 @@ class CycleViewSet(BaseViewSet):
.annotate(
backlog_estimate_points=Coalesce(
Subquery(backlog_estimate_point),
Value(0, output_field=IntegerField()),
Value(0, output_field=FloatField()),
),
)
.annotate(
unstarted_estimate_points=Coalesce(
Subquery(unstarted_estimate_point),
Value(0, output_field=IntegerField()),
Value(0, output_field=FloatField()),
),
)
.annotate(
started_estimate_points=Coalesce(
Subquery(started_estimate_point),
Value(0, output_field=IntegerField()),
Value(0, output_field=FloatField()),
),
)
.annotate(
cancelled_estimate_points=Coalesce(
Subquery(cancelled_estimate_point),
Value(0, output_field=IntegerField()),
Value(0, output_field=FloatField()),
),
)
.annotate(
completed_estimate_points=Coalesce(
Subquery(completed_estimate_point),
Value(0, output_field=IntegerField()),
Value(0, output_field=FloatField()),
),
)
.annotate(
total_estimate_points=Coalesce(
Subquery(total_estimate_point),
Value(0, output_field=IntegerField()),
Value(0, output_field=FloatField()),
),
)
.order_by("-is_favorite", "name")
@@ -395,12 +395,12 @@ class CycleViewSet(BaseViewSet):
.values("display_name", "assignee_id", "avatar")
.annotate(
total_estimates=Sum(
Cast("estimate_point__value", IntegerField())
Cast("estimate_point__value", FloatField())
)
)
.annotate(
completed_estimates=Sum(
Cast("estimate_point__value", IntegerField()),
Cast("estimate_point__value", FloatField()),
filter=Q(
completed_at__isnull=False,
archived_at__isnull=True,
@@ -410,7 +410,7 @@ class CycleViewSet(BaseViewSet):
)
.annotate(
pending_estimates=Sum(
Cast("estimate_point__value", IntegerField()),
Cast("estimate_point__value", FloatField()),
filter=Q(
completed_at__isnull=True,
archived_at__isnull=True,
@@ -433,12 +433,12 @@ class CycleViewSet(BaseViewSet):
.values("label_name", "color", "label_id")
.annotate(
total_estimates=Sum(
Cast("estimate_point__value", IntegerField())
Cast("estimate_point__value", FloatField())
)
)
.annotate(
completed_estimates=Sum(
Cast("estimate_point__value", IntegerField()),
Cast("estimate_point__value", FloatField()),
filter=Q(
completed_at__isnull=False,
archived_at__isnull=True,
@@ -448,7 +448,7 @@ class CycleViewSet(BaseViewSet):
)
.annotate(
pending_estimates=Sum(
Cast("estimate_point__value", IntegerField()),
Cast("estimate_point__value", FloatField()),
filter=Q(
completed_at__isnull=True,
archived_at__isnull=True,
@@ -465,14 +465,14 @@ class CycleViewSet(BaseViewSet):
}
if data[0]["start_date"] and data[0]["end_date"]:
data[0]["estimate_distribution"]["completion_chart"] = (
burndown_plot(
queryset=queryset.first(),
slug=slug,
project_id=project_id,
plot_type="points",
cycle_id=data[0]["id"],
)
data[0]["estimate_distribution"][
"completion_chart"
] = burndown_plot(
queryset=queryset.first(),
slug=slug,
project_id=project_id,
plot_type="points",
cycle_id=data[0]["id"],
)
assignee_distribution = (
@@ -645,6 +645,8 @@ class CycleViewSet(BaseViewSet):
"progress_snapshot",
"logo_props",
# meta fields
"completed_estimate_points",
"total_estimate_points",
"is_favorite",
"cancelled_issues",
"total_issues",
@@ -654,6 +656,7 @@ class CycleViewSet(BaseViewSet):
"backlog_issues",
"assignee_ids",
"status",
"created_by",
)
.first()
)
@@ -739,6 +742,8 @@ class CycleViewSet(BaseViewSet):
"progress_snapshot",
"logo_props",
# meta fields
"completed_estimate_points",
"total_estimate_points",
"is_favorite",
"total_issues",
"cancelled_issues",
@@ -748,6 +753,7 @@ class CycleViewSet(BaseViewSet):
"backlog_issues",
"assignee_ids",
"status",
"created_by",
).first()
# Send the model activity
@@ -800,6 +806,8 @@ class CycleViewSet(BaseViewSet):
"sub_issues",
"logo_props",
# meta fields
"completed_estimate_points",
"total_estimate_points",
"is_favorite",
"total_issues",
"cancelled_issues",
@@ -836,12 +844,12 @@ class CycleViewSet(BaseViewSet):
.values("display_name", "assignee_id", "avatar")
.annotate(
total_estimates=Sum(
Cast("estimate_point__value", IntegerField())
Cast("estimate_point__value", FloatField())
)
)
.annotate(
completed_estimates=Sum(
Cast("estimate_point__value", IntegerField()),
Cast("estimate_point__value", FloatField()),
filter=Q(
completed_at__isnull=False,
archived_at__isnull=True,
@@ -851,7 +859,7 @@ class CycleViewSet(BaseViewSet):
)
.annotate(
pending_estimates=Sum(
Cast("estimate_point__value", IntegerField()),
Cast("estimate_point__value", FloatField()),
filter=Q(
completed_at__isnull=True,
archived_at__isnull=True,
@@ -874,12 +882,12 @@ class CycleViewSet(BaseViewSet):
.values("label_name", "color", "label_id")
.annotate(
total_estimates=Sum(
Cast("estimate_point__value", IntegerField())
Cast("estimate_point__value", FloatField())
)
)
.annotate(
completed_estimates=Sum(
Cast("estimate_point__value", IntegerField()),
Cast("estimate_point__value", FloatField()),
filter=Q(
completed_at__isnull=False,
archived_at__isnull=True,
@@ -889,7 +897,7 @@ class CycleViewSet(BaseViewSet):
)
.annotate(
pending_estimates=Sum(
Cast("estimate_point__value", IntegerField()),
Cast("estimate_point__value", FloatField()),
filter=Q(
completed_at__isnull=True,
archived_at__isnull=True,
@@ -1234,12 +1242,12 @@ class TransferCycleIssueEndpoint(BaseAPIView):
.values("display_name", "assignee_id", "avatar")
.annotate(
total_estimates=Sum(
Cast("estimate_point__value", IntegerField())
Cast("estimate_point__value", FloatField())
)
)
.annotate(
completed_estimates=Sum(
Cast("estimate_point__value", IntegerField()),
Cast("estimate_point__value", FloatField()),
filter=Q(
completed_at__isnull=False,
archived_at__isnull=True,
@@ -1249,7 +1257,7 @@ class TransferCycleIssueEndpoint(BaseAPIView):
)
.annotate(
pending_estimates=Sum(
Cast("estimate_point__value", IntegerField()),
Cast("estimate_point__value", FloatField()),
filter=Q(
completed_at__isnull=True,
archived_at__isnull=True,
@@ -1288,12 +1296,12 @@ class TransferCycleIssueEndpoint(BaseAPIView):
.values("label_name", "color", "label_id")
.annotate(
total_estimates=Sum(
Cast("estimate_point__value", IntegerField())
Cast("estimate_point__value", FloatField())
)
)
.annotate(
completed_estimates=Sum(
Cast("estimate_point__value", IntegerField()),
Cast("estimate_point__value", FloatField()),
filter=Q(
completed_at__isnull=False,
archived_at__isnull=True,
@@ -1303,7 +1311,7 @@ class TransferCycleIssueEndpoint(BaseAPIView):
)
.annotate(
pending_estimates=Sum(
Cast("estimate_point__value", IntegerField()),
Cast("estimate_point__value", FloatField()),
filter=Q(
completed_at__isnull=True,
archived_at__isnull=True,
@@ -1507,14 +1515,40 @@ class TransferCycleIssueEndpoint(BaseAPIView):
)
updated_cycles = []
update_cycle_issue_activity = []
for cycle_issue in cycle_issues:
cycle_issue.cycle_id = new_cycle_id
updated_cycles.append(cycle_issue)
update_cycle_issue_activity.append(
{
"old_cycle_id": str(cycle_id),
"new_cycle_id": str(new_cycle_id),
"issue_id": str(cycle_issue.issue_id),
}
)
cycle_issues = CycleIssue.objects.bulk_update(
updated_cycles, ["cycle_id"], batch_size=100
)
# Capture Issue Activity
issue_activity.delay(
type="cycle.activity.created",
requested_data=json.dumps({"cycles_list": []}),
actor_id=str(self.request.user.id),
issue_id=None,
project_id=str(self.kwargs.get("project_id", None)),
current_instance=json.dumps(
{
"updated_cycle_issues": update_cycle_issue_activity,
"created_cycle_issues": "[]",
}
),
epoch=int(timezone.now().timestamp()),
notification=True,
origin=request.META.get("HTTP_ORIGIN"),
)
return Response({"message": "Success"}, status=status.HTTP_200_OK)
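Both transfer endpoints above now reassign cycle issues with a single `bulk_update` (batched at 100) and record one consolidated activity payload rather than saving per issue. The bookkeeping half of that loop can be sketched in plain Python (field names mirror the diff; `CycleIssue` here is a stand-in object, not the Django model):

```python
class CycleIssue:
    """Stand-in for the Django model, holding only the fields the loop touches."""
    def __init__(self, issue_id, cycle_id):
        self.issue_id = issue_id
        self.cycle_id = cycle_id


def prepare_transfer(cycle_issues, old_cycle_id, new_cycle_id):
    """Mutate each issue's cycle and collect one activity record per issue,
    ready for a single bulk_update plus one background-task call."""
    updated, activity = [], []
    for ci in cycle_issues:
        ci.cycle_id = new_cycle_id
        updated.append(ci)
        activity.append({
            "old_cycle_id": str(old_cycle_id),
            "new_cycle_id": str(new_cycle_id),
            "issue_id": str(ci.issue_id),
        })
    return updated, activity


issues = [CycleIssue(1, "c1"), CycleIssue(2, "c1")]
updated, activity = prepare_transfer(issues, "c1", "c2")
print([ci.cycle_id for ci in updated])  # ['c2', 'c2']
print(activity[0]["old_cycle_id"])      # c1
```

In the real view, `updated` feeds `CycleIssue.objects.bulk_update(updated, ["cycle_id"], batch_size=100)` and `activity` is JSON-serialized into `issue_activity.delay(...)`, so the transfer costs a bounded number of queries regardless of issue count.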


@@ -542,6 +542,7 @@ def dashboard_recent_collaborators(self, request, slug):
project__project_projectmember__member=request.user,
project__project_projectmember__is_active=True,
project__archived_at__isnull=True,
is_active=True,
)
.annotate(
num_activities=Coalesce(


@@ -32,7 +32,7 @@ from plane.app.permissions import (
from plane.app.serializers import (
IssueCreateSerializer,
IssueDetailSerializer,
IssuePropertySerializer,
IssueUserPropertySerializer,
IssueSerializer,
)
from plane.bgtasks.issue_activites_task import issue_activity
@@ -40,7 +40,7 @@ from plane.db.models import (
Issue,
IssueAttachment,
IssueLink,
IssueProperty,
IssueUserProperty,
IssueReaction,
IssueSubscriber,
Project,
@@ -289,13 +289,9 @@ class IssueViewSet(BaseViewSet):
),
group_by_field_name=group_by,
sub_group_by_field_name=sub_group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
archived_at__isnull=True,
is_draft=False,
count_queryset=Issue.issue_objects.filter(
workspace__slug=slug,
project_id=project_id,
),
)
else:
@@ -304,6 +300,10 @@ class IssueViewSet(BaseViewSet):
request=request,
order_by=order_by_param,
queryset=issue_queryset,
count_queryset=Issue.issue_objects.filter(
workspace__slug=slug,
project_id=project_id,
),
on_results=lambda issues: issue_on_results(
group_by=group_by,
issues=issues,
@@ -317,20 +317,16 @@ class IssueViewSet(BaseViewSet):
filters=filters,
),
group_by_field_name=group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
)
else:
return self.paginate(
order_by=order_by_param,
request=request,
queryset=issue_queryset,
count_queryset=Issue.issue_objects.filter(
workspace__slug=slug,
project_id=project_id,
),
on_results=lambda issues: issue_on_results(
group_by=group_by, issues=issues, sub_group_by=sub_group_by
),
@@ -481,7 +477,38 @@ class IssueViewSet(BaseViewSet):
return Response(serializer.data, status=status.HTTP_200_OK)
def partial_update(self, request, slug, project_id, pk=None):
issue = self.get_queryset().filter(pk=pk).first()
issue = (
self.get_queryset()
.annotate(
label_ids=Coalesce(
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
assignee_ids=Coalesce(
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
module_ids=Coalesce(
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=~Q(issue_module__module_id__isnull=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
)
.filter(pk=pk)
.first()
)
if not issue:
return Response(
@@ -539,7 +566,7 @@ class IssueUserDisplayPropertyEndpoint(BaseAPIView):
]
def patch(self, request, slug, project_id):
issue_property = IssueProperty.objects.get(
issue_property = IssueUserProperty.objects.get(
user=request.user,
project_id=project_id,
)
@@ -554,14 +581,14 @@ class IssueUserDisplayPropertyEndpoint(BaseAPIView):
"display_properties", issue_property.display_properties
)
issue_property.save()
serializer = IssuePropertySerializer(issue_property)
serializer = IssueUserPropertySerializer(issue_property)
return Response(serializer.data, status=status.HTTP_201_CREATED)
def get(self, request, slug, project_id):
issue_property, _ = IssueProperty.objects.get_or_create(
issue_property, _ = IssueUserProperty.objects.get_or_create(
user=request.user, project_id=project_id
)
serializer = IssuePropertySerializer(issue_property)
serializer = IssueUserPropertySerializer(issue_property)
return Response(serializer.data, status=status.HTTP_200_OK)
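The hunks above (matching the head commit "move paginator to different files add count queryset for counting") replace the old `count_filter` with a `count_queryset`: the total is computed from a cheap `Issue.issue_objects.filter(...)` rather than the heavily annotated result queryset. A toy paginator with that shape, in plain Python (the function and its parameters are illustrative, not Plane's paginator API):

```python
def paginate(results, page, per_page, count_source=None):
    """Return one page of results plus a total count.

    `count_source`, when given, is a cheaper collection used only for
    counting, mirroring the `count_queryset` argument in the diff."""
    total = len(count_source if count_source is not None else results)
    start = (page - 1) * per_page
    return {"total_count": total, "results": results[start:start + per_page]}


# Expensive rows stand in for the annotated issue queryset;
# cheap_ids stands in for the bare count queryset.
expensive_rows = [{"id": i, "annotated": True} for i in range(5)]
cheap_ids = list(range(5))

page = paginate(expensive_rows, page=1, per_page=2, count_source=cheap_ids)
print(page["total_count"])   # 5
print(len(page["results"]))  # 2
```

Separating the count source avoids running the grouped/annotated SQL twice just to learn how many rows exist.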


@@ -3,8 +3,11 @@ import json
# Django imports
from django.utils import timezone
from django.db.models import Q
from django.db.models import Q, OuterRef, F, Func, UUIDField, Value, CharField
from django.core.serializers.json import DjangoJSONEncoder
from django.db.models.functions import Coalesce
from django.contrib.postgres.aggregates import ArrayAgg
from django.contrib.postgres.fields import ArrayField
# Third Party imports
from rest_framework.response import Response
@@ -20,6 +23,9 @@ from plane.app.permissions import ProjectEntityPermission
from plane.db.models import (
Project,
IssueRelation,
Issue,
IssueAttachment,
IssueLink,
)
from plane.bgtasks.issue_activites_task import issue_activity
@@ -61,56 +67,149 @@ class IssueRelationViewSet(BaseViewSet):
.order_by("-created_at")
.distinct()
)
# get all blocking issues
blocking_issues = issue_relations.filter(
relation_type="blocked_by", related_issue_id=issue_id
)
).values_list("issue_id", flat=True)
# get all blocked by issues
blocked_by_issues = issue_relations.filter(
relation_type="blocked_by", issue_id=issue_id
)
).values_list("related_issue_id", flat=True)
# get all duplicate issues
duplicate_issues = issue_relations.filter(
issue_id=issue_id, relation_type="duplicate"
)
).values_list("related_issue_id", flat=True)
# get all relates to issues
duplicate_issues_related = issue_relations.filter(
related_issue_id=issue_id, relation_type="duplicate"
)
).values_list("issue_id", flat=True)
# get all relates to issues
relates_to_issues = issue_relations.filter(
issue_id=issue_id, relation_type="relates_to"
)
).values_list("related_issue_id", flat=True)
# get all relates to issues
relates_to_issues_related = issue_relations.filter(
related_issue_id=issue_id, relation_type="relates_to"
)
).values_list("issue_id", flat=True)
blocked_by_issues_serialized = IssueRelationSerializer(
blocked_by_issues, many=True
).data
duplicate_issues_serialized = IssueRelationSerializer(
duplicate_issues, many=True
).data
relates_to_issues_serialized = IssueRelationSerializer(
relates_to_issues, many=True
).data
queryset = (
Issue.issue_objects.filter(
workspace__slug=slug,
project_id=project_id,
)
.filter(workspace__slug=self.kwargs.get("slug"))
.select_related("workspace", "project", "state", "parent")
.prefetch_related("assignees", "labels", "issue_module__module")
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
label_ids=Coalesce(
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
assignee_ids=Coalesce(
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
)
).distinct()
# reverse relation for blocked by issues
blocking_issues_serialized = RelatedIssueSerializer(
blocking_issues, many=True
).data
# reverse relation for duplicate issues
duplicate_issues_related_serialized = RelatedIssueSerializer(
duplicate_issues_related, many=True
).data
# reverse relation for related issues
relates_to_issues_related_serialized = RelatedIssueSerializer(
relates_to_issues_related, many=True
).data
# Fields
fields = [
"id",
"name",
"state_id",
"sort_order",
"priority",
"sequence_id",
"project_id",
"label_ids",
"assignee_ids",
"created_at",
"updated_at",
"created_by",
"updated_by",
"relation_type",
]
response_data = {
"blocking": blocking_issues_serialized,
"blocked_by": blocked_by_issues_serialized,
"duplicate": duplicate_issues_serialized
+ duplicate_issues_related_serialized,
"relates_to": relates_to_issues_serialized
+ relates_to_issues_related_serialized,
"blocking": queryset.filter(pk__in=blocking_issues)
.annotate(
relation_type=Value("blocking", output_field=CharField())
)
.values(*fields),
"blocked_by": queryset.filter(pk__in=blocked_by_issues)
.annotate(
relation_type=Value("blocked_by", output_field=CharField())
)
.values(*fields),
"duplicate": queryset.filter(pk__in=duplicate_issues)
.annotate(
relation_type=Value(
"duplicate",
output_field=CharField(),
)
)
.values(*fields)
| queryset.filter(pk__in=duplicate_issues_related)
.annotate(
relation_type=Value(
"duplicate",
output_field=CharField(),
)
)
.values(*fields),
"relates_to": queryset.filter(pk__in=relates_to_issues)
.annotate(
relation_type=Value(
"relates_to",
output_field=CharField(),
)
)
.values(*fields)
| queryset.filter(pk__in=relates_to_issues_related)
.annotate(
relation_type=Value(
"relates_to",
output_field=CharField(),
)
)
.values(*fields),
}
return Response(response_data, status=status.HTTP_200_OK)
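The rework above stops serializing whole relation objects and instead collects related issue ids with `values_list(..., flat=True)`, then feeds each bucket of ids into one shared annotated issue queryset. The id-partitioning step can be mirrored in plain Python over `(issue_id, related_issue_id, relation_type)` tuples (illustrative data, not Plane's models):

```python
def partition_relations(relations, issue_id):
    """Mirror the endpoint's bucketing. Each row is
    (issue_id, related_issue_id, relation_type)."""
    buckets = {"blocking": [], "blocked_by": [], "duplicate": [], "relates_to": []}
    for src, dst, kind in relations:
        if kind == "blocked_by" and dst == issue_id:
            buckets["blocking"].append(src)       # rows pointing at this issue
        elif kind == "blocked_by" and src == issue_id:
            buckets["blocked_by"].append(dst)
        elif kind == "duplicate":                 # symmetric: merge both directions
            if src == issue_id:
                buckets["duplicate"].append(dst)
            elif dst == issue_id:
                buckets["duplicate"].append(src)
        elif kind == "relates_to":                # symmetric as well
            if src == issue_id:
                buckets["relates_to"].append(dst)
            elif dst == issue_id:
                buckets["relates_to"].append(src)
    return buckets


rows = [(1, 2, "blocked_by"), (3, 1, "blocked_by"), (1, 4, "duplicate")]
print(partition_relations(rows, 1))
# {'blocking': [3], 'blocked_by': [2], 'duplicate': [4], 'relates_to': []}
```

Returning ids and resolving them through one `Issue.issue_objects` queryset keeps all buckets consistent (same annotations, same `.values(*fields)` shape) and drops the per-bucket serializer passes.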


@@ -99,6 +99,7 @@ class SubIssuesEndpoint(BaseAPIView):
),
)
.annotate(state_group=F("state__group"))
.order_by("-created_at")
)
# create's a dict with state group name with their respective issue id's


@@ -17,6 +17,7 @@ from django.db.models import (
UUIDField,
Value,
Sum,
FloatField,
)
from django.db.models.functions import Coalesce, Cast
from django.core.serializers.json import DjangoJSONEncoder
@@ -138,7 +139,7 @@ class ModuleViewSet(BaseViewSet):
.values("issue_module__module_id")
.annotate(
completed_estimate_points=Sum(
Cast("estimate_point__value", IntegerField())
Cast("estimate_point__value", FloatField())
)
)
.values("completed_estimate_points")[:1]
@@ -152,7 +153,7 @@ class ModuleViewSet(BaseViewSet):
.values("issue_module__module_id")
.annotate(
total_estimate_points=Sum(
Cast("estimate_point__value", IntegerField())
Cast("estimate_point__value", FloatField())
)
)
.values("total_estimate_points")[:1]
@@ -166,7 +167,7 @@ class ModuleViewSet(BaseViewSet):
.values("issue_module__module_id")
.annotate(
backlog_estimate_point=Sum(
Cast("estimate_point__value", IntegerField())
Cast("estimate_point__value", FloatField())
)
)
.values("backlog_estimate_point")[:1]
@@ -180,7 +181,7 @@ class ModuleViewSet(BaseViewSet):
.values("issue_module__module_id")
.annotate(
unstarted_estimate_point=Sum(
Cast("estimate_point__value", IntegerField())
Cast("estimate_point__value", FloatField())
)
)
.values("unstarted_estimate_point")[:1]
@@ -194,7 +195,7 @@ class ModuleViewSet(BaseViewSet):
.values("issue_module__module_id")
.annotate(
started_estimate_point=Sum(
Cast("estimate_point__value", IntegerField())
Cast("estimate_point__value", FloatField())
)
)
.values("started_estimate_point")[:1]
@@ -208,7 +209,7 @@ class ModuleViewSet(BaseViewSet):
.values("issue_module__module_id")
.annotate(
cancelled_estimate_point=Sum(
Cast("estimate_point__value", IntegerField())
Cast("estimate_point__value", FloatField())
)
)
.values("cancelled_estimate_point")[:1]
@@ -270,37 +271,37 @@ class ModuleViewSet(BaseViewSet):
.annotate(
backlog_estimate_points=Coalesce(
Subquery(backlog_estimate_point),
Value(0, output_field=IntegerField()),
Value(0, output_field=FloatField()),
),
)
.annotate(
unstarted_estimate_points=Coalesce(
Subquery(unstarted_estimate_point),
Value(0, output_field=IntegerField()),
Value(0, output_field=FloatField()),
),
)
.annotate(
started_estimate_points=Coalesce(
Subquery(started_estimate_point),
Value(0, output_field=IntegerField()),
Value(0, output_field=FloatField()),
),
)
.annotate(
cancelled_estimate_points=Coalesce(
Subquery(cancelled_estimate_point),
Value(0, output_field=IntegerField()),
Value(0, output_field=FloatField()),
),
)
.annotate(
completed_estimate_points=Coalesce(
Subquery(completed_estimate_point),
Value(0, output_field=IntegerField()),
Value(0, output_field=FloatField()),
),
)
.annotate(
total_estimate_points=Coalesce(
Subquery(total_estimate_point),
Value(0, output_field=IntegerField()),
Value(0, output_field=FloatField()),
),
)
.annotate(
@@ -475,12 +476,12 @@ class ModuleViewSet(BaseViewSet):
)
.annotate(
total_estimates=Sum(
Cast("estimate_point__value", IntegerField())
Cast("estimate_point__value", FloatField())
),
)
.annotate(
completed_estimates=Sum(
Cast("estimate_point__value", IntegerField()),
Cast("estimate_point__value", FloatField()),
filter=Q(
completed_at__isnull=False,
archived_at__isnull=True,
@@ -490,7 +491,7 @@ class ModuleViewSet(BaseViewSet):
)
.annotate(
pending_estimates=Sum(
Cast("estimate_point__value", IntegerField()),
Cast("estimate_point__value", FloatField()),
filter=Q(
completed_at__isnull=True,
archived_at__isnull=True,
@@ -513,12 +514,12 @@ class ModuleViewSet(BaseViewSet):
.values("label_name", "color", "label_id")
.annotate(
total_estimates=Sum(
Cast("estimate_point__value", IntegerField())
Cast("estimate_point__value", FloatField())
),
)
.annotate(
completed_estimates=Sum(
Cast("estimate_point__value", IntegerField()),
Cast("estimate_point__value", FloatField()),
filter=Q(
completed_at__isnull=False,
archived_at__isnull=True,
@@ -528,7 +529,7 @@ class ModuleViewSet(BaseViewSet):
)
.annotate(
pending_estimates=Sum(
Cast("estimate_point__value", IntegerField()),
Cast("estimate_point__value", FloatField()),
filter=Q(
completed_at__isnull=True,
archived_at__isnull=True,
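
The `IntegerField` → `FloatField` swap repeated throughout this hunk is what lets fractional estimate points (stored as strings in `estimate_point__value`) survive aggregation. A minimal plain-Python sketch of the difference — the `sum_points` helper is illustrative, not part of the codebase:

```python
def sum_points(raw_values, cast):
    """Aggregate estimate-point values after casting each one,
    mirroring Sum(Cast("estimate_point__value", <field>))."""
    return sum(cast(v) for v in raw_values)

points = ["1", "2.5", "0.5"]  # estimate point values arrive as strings
print(sum_points(points, float))  # 4.0

try:
    sum_points(points, int)  # the old integer cast cannot parse "2.5"
except ValueError as exc:
    print(f"integer cast failed: {exc}")
```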

View File

@@ -1,5 +1,5 @@
# Django imports
from django.db.models import Exists, OuterRef, Q
from django.db.models import Exists, OuterRef, Q, Case, When, BooleanField
from django.utils import timezone
# Third party imports
@@ -45,12 +45,28 @@ class NotificationViewSet(BaseViewSet, BasePaginator):
archived = request.GET.get("archived", "false")
read = request.GET.get("read", None)
type = request.GET.get("type", "all")
mentioned = request.GET.get("mentioned", False)
q_filters = Q()
inbox_issue = Issue.objects.filter(
pk=OuterRef("entity_identifier"),
issue_inbox__status__in=[0, 2, -2],
workspace__slug=self.kwargs.get("slug"),
)
notifications = (
Notification.objects.filter(
workspace__slug=slug, receiver_id=request.user.id
)
.filter(entity_name="issue")
.annotate(is_inbox_issue=Exists(inbox_issue))
.annotate(
is_mentioned_notification=Case(
When(sender__icontains="mentioned", then=True),
default=False,
output_field=BooleanField(),
)
)
.select_related("workspace", "project", "triggered_by", "receiver")
.order_by("snoozed_till", "-created_at")
)
@@ -78,6 +94,13 @@ class NotificationViewSet(BaseViewSet, BasePaginator):
if read == "true":
notifications = notifications.filter(read_at__isnull=False)
if mentioned:
notifications = notifications.filter(sender__icontains="mentioned")
else:
notifications = notifications.exclude(
sender__icontains="mentioned"
)
type = type.split(",")
# Subscribed issues
if "subscribed" in type:
@@ -202,46 +225,35 @@ class NotificationViewSet(BaseViewSet, BasePaginator):
class UnreadNotificationEndpoint(BaseAPIView):
def get(self, request, slug):
# Watching Issues Count
subscribed_issues_count = Notification.objects.filter(
workspace__slug=slug,
receiver_id=request.user.id,
read_at__isnull=True,
archived_at__isnull=True,
snoozed_till__isnull=True,
entity_identifier__in=IssueSubscriber.objects.filter(
workspace__slug=slug, subscriber_id=request.user.id
).values_list("issue_id", flat=True),
).count()
unread_notifications_count = (
Notification.objects.filter(
workspace__slug=slug,
receiver_id=request.user.id,
read_at__isnull=True,
archived_at__isnull=True,
snoozed_till__isnull=True,
)
.exclude(sender__icontains="mentioned")
.count()
)
# My Issues Count
my_issues_count = Notification.objects.filter(
mention_notifications_count = Notification.objects.filter(
workspace__slug=slug,
receiver_id=request.user.id,
read_at__isnull=True,
archived_at__isnull=True,
snoozed_till__isnull=True,
entity_identifier__in=IssueAssignee.objects.filter(
workspace__slug=slug, assignee_id=request.user.id
).values_list("issue_id", flat=True),
).count()
# Created Issues Count
created_issues_count = Notification.objects.filter(
workspace__slug=slug,
receiver_id=request.user.id,
read_at__isnull=True,
archived_at__isnull=True,
snoozed_till__isnull=True,
entity_identifier__in=Issue.objects.filter(
workspace__slug=slug, created_by=request.user
).values_list("pk", flat=True),
sender__icontains="mentioned",
).count()
return Response(
{
"subscribed_issues": subscribed_issues_count,
"my_issues": my_issues_count,
"created_issues": created_issues_count,
"total_unread_notifications_count": int(
unread_notifications_count
),
"mention_unread_notifications_count": int(
mention_notifications_count
),
},
status=status.HTTP_200_OK,
)
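
The `Case(When(sender__icontains="mentioned", ...))` annotation and the separate mention/unread counts above both reduce to a substring check on `sender`. A hedged pure-Python equivalent of that classification, with dict records standing in for notification rows (the exact `sender` strings are illustrative):

```python
def classify(notifications):
    """Tag each notification the way the Case/When annotation does."""
    return [
        {**n, "is_mentioned_notification": "mentioned" in n["sender"]}
        for n in notifications
    ]

rows = [
    {"id": 1, "sender": "in_app:issue_activities:mentioned"},
    {"id": 2, "sender": "in_app:issue_activities"},
]
tagged = classify(rows)

# Mirrors the split into mention vs non-mention unread counts
mention_count = sum(n["is_mentioned_notification"] for n in tagged)
unread_count = sum(not n["is_mentioned_notification"] for n in tagged)
print(mention_count, unread_count)  # 1 1
```

Note that `icontains` is case-insensitive while `in` is not; the ORM version is slightly more permissive.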

View File

@@ -38,6 +38,7 @@ from plane.db.models import (
from ..base import BaseAPIView, BaseViewSet
from plane.bgtasks.page_transaction_task import page_transaction
from plane.bgtasks.page_version_task import page_version
def unarchive_archive_page_and_descendants(page_id, archived_at):
@@ -215,8 +216,14 @@ class PageViewSet(BaseViewSet):
status=status.HTTP_404_NOT_FOUND,
)
else:
issue_ids = PageLog.objects.filter(
page_id=pk, entity_name="issue"
).values_list("entity_identifier", flat=True)
data = PageDetailSerializer(page).data
data["issue_ids"] = issue_ids
return Response(
PageDetailSerializer(page).data, status=status.HTTP_200_OK
data,
status=status.HTTP_200_OK,
)
def lock(self, request, slug, project_id, pk):
@@ -238,6 +245,28 @@ class PageViewSet(BaseViewSet):
return Response(status=status.HTTP_204_NO_CONTENT)
def access(self, request, slug, project_id, pk):
access = request.data.get("access", 0)
page = Page.objects.filter(
pk=pk, workspace__slug=slug, projects__id=project_id
).first()
# Only update access if the page owner is the requesting user
if (
page.access != request.data.get("access", page.access)
and page.owned_by_id != request.user.id
):
return Response(
{
"error": "Access cannot be updated since this page is owned by someone else"
},
status=status.HTTP_400_BAD_REQUEST,
)
page.access = access
page.save()
return Response(status=status.HTTP_204_NO_CONTENT)
def list(self, request, slug, project_id):
queryset = self.get_queryset()
pages = PageSerializer(queryset, many=True).data
@@ -463,6 +492,12 @@ class PagesDescriptionViewSet(BaseViewSet):
.first()
)
if page is None:
return Response(
{"error": "Page not found"},
status=404,
)
if page.is_locked:
return Response(
{"error": "Page is locked"},
@@ -475,16 +510,38 @@ class PagesDescriptionViewSet(BaseViewSet):
status=472,
)
# Serialize the existing instance
existing_instance = json.dumps(
{
"description_html": page.description_html,
},
cls=DjangoJSONEncoder,
)
# Get the base64 data from the request
base64_data = request.data.get("description_binary")
# If base64 data is provided
if base64_data:
# Decode the base64 data to bytes
new_binary_data = base64.b64decode(base64_data)
# capture the page transaction
if request.data.get("description_html"):
page_transaction.delay(
new_value=request.data,
old_value=existing_instance,
page_id=pk,
)
# Store the updated binary data
page.description_binary = new_binary_data
page.description_html = request.data.get("description_html")
page.save()
# Return a success response
page_version.delay(
page_id=page.id,
existing_instance=existing_instance,
user_id=request.user.id,
)
return Response({"message": "Updated successfully"})
else:
return Response({"error": "No binary data provided"})
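
The update path above round-trips the collaborative document state as base64: the client posts `description_binary` encoded as text, and the view decodes it back to bytes before saving. The standard-library encode/decode pair it relies on (the sample payload is illustrative):

```python
import base64

# What a client would send: binary editor state encoded as base64 text
binary_state = b"\x00\x01collab-doc-state"
payload = base64.b64encode(binary_state).decode("ascii")

# What the view does with request.data.get("description_binary")
decoded = base64.b64decode(payload)
print(decoded == binary_state)  # True
```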

View File

@@ -0,0 +1,37 @@
# Third party imports
from rest_framework import status
from rest_framework.response import Response
# Module imports
from plane.db.models import PageVersion
from ..base import BaseAPIView
from plane.app.permissions import ProjectEntityPermission
from plane.app.serializers import PageVersionSerializer
class PageVersionEndpoint(BaseAPIView):
permission_classes = [
ProjectEntityPermission,
]
def get(self, request, slug, project_id, page_id, pk=None):
# Check if pk is provided
if pk:
# Return a single page version
page_version = PageVersion.objects.get(
workspace__slug=slug,
page_id=page_id,
pk=pk,
)
# Serialize the page version
serializer = PageVersionSerializer(page_version)
return Response(serializer.data, status=status.HTTP_200_OK)
# Return all page versions
page_versions = PageVersion.objects.filter(
workspace__slug=slug,
page_id=page_id,
)
# Serialize the page versions
serializer = PageVersionSerializer(page_versions, many=True)
return Response(serializer.data, status=status.HTTP_200_OK)

View File

@@ -39,7 +39,7 @@ from plane.db.models import (
Cycle,
Inbox,
DeployBoard,
IssueProperty,
IssueUserProperty,
Issue,
Module,
Project,
@@ -266,7 +266,7 @@ class ProjectViewSet(BaseViewSet):
role=20,
)
# Also create the issue property for the user
_ = IssueProperty.objects.create(
_ = IssueUserProperty.objects.create(
project_id=serializer.data["id"],
user=request.user,
)
@@ -280,7 +280,7 @@ class ProjectViewSet(BaseViewSet):
role=20,
)
# Also create the issue property for the user
IssueProperty.objects.create(
IssueUserProperty.objects.create(
project_id=serializer.data["id"],
user_id=serializer.data["project_lead"],
)
@@ -342,6 +342,7 @@ class ProjectViewSet(BaseViewSet):
.first()
)
# Create the model activity
model_activity.delay(
model_name="project",
model_id=str(project.id),

View File

@@ -25,7 +25,7 @@ from plane.db.models import (
ProjectMemberInvite,
User,
WorkspaceMember,
IssueProperty,
IssueUserProperty,
)
@@ -179,9 +179,9 @@ class UserProjectInvitationsViewset(BaseViewSet):
ignore_conflicts=True,
)
IssueProperty.objects.bulk_create(
IssueUserProperty.objects.bulk_create(
[
IssueProperty(
IssueUserProperty(
project_id=project_id,
user=request.user,
workspace=workspace,

View File

@@ -22,7 +22,7 @@ from plane.db.models import (
ProjectMember,
Workspace,
TeamMember,
IssueProperty,
IssueUserProperty,
)
from plane.bgtasks.project_add_user_email_task import project_add_user_email
from plane.utils.host import base_host
@@ -136,7 +136,7 @@ class ProjectMemberViewSet(BaseViewSet):
)
# Create a new issue property
bulk_issue_props.append(
IssueProperty(
IssueUserProperty(
user_id=member.get("member_id"),
project_id=project_id,
workspace_id=project.workspace_id,
@@ -150,7 +150,7 @@ class ProjectMemberViewSet(BaseViewSet):
ignore_conflicts=True,
)
_ = IssueProperty.objects.bulk_create(
_ = IssueUserProperty.objects.bulk_create(
bulk_issue_props, batch_size=10, ignore_conflicts=True
)
@@ -323,7 +323,7 @@ class AddTeamToProjectEndpoint(BaseAPIView):
)
)
issue_props.append(
IssueProperty(
IssueUserProperty(
project_id=project_id,
user_id=member,
workspace=workspace,
@@ -335,7 +335,7 @@ class AddTeamToProjectEndpoint(BaseAPIView):
project_members, batch_size=10, ignore_conflicts=True
)
_ = IssueProperty.objects.bulk_create(
_ = IssueUserProperty.objects.bulk_create(
issue_props, batch_size=10, ignore_conflicts=True
)

View File

@@ -1,5 +1,4 @@
# Python imports
import re
# Django imports
from django.db.models import Q
@@ -11,13 +10,7 @@ from rest_framework.response import Response
# Module imports
from .base import BaseAPIView
from plane.db.models import (
Workspace,
Project,
Issue,
Cycle,
Module,
Page,
IssueView,
)
from plane.utils.issue_search import search_issues

View File

@@ -37,6 +37,9 @@ from plane.utils.paginator import BasePaginator
from plane.authentication.utils.host import user_ip
from plane.bgtasks.user_deactivation_email_task import user_deactivation_email
from plane.utils.host import base_host
from django.utils.decorators import method_decorator
from django.views.decorators.cache import cache_control
from django.views.decorators.vary import vary_on_cookie
class UserEndpoint(BaseViewSet):
@@ -47,6 +50,8 @@ class UserEndpoint(BaseViewSet):
return self.request.user
@cache_response(60 * 60)
@method_decorator(cache_control(private=True, max_age=12))
@method_decorator(vary_on_cookie)
def retrieve(self, request):
serialized_data = UserMeSerializer(request.user).data
return Response(
@@ -55,6 +60,8 @@ class UserEndpoint(BaseViewSet):
)
@cache_response(60 * 60)
@method_decorator(cache_control(private=True, max_age=12))
@method_decorator(vary_on_cookie)
def retrieve_user_settings(self, request):
serialized_data = UserMeSettingsSerializer(request.user).data
return Response(serialized_data, status=status.HTTP_200_OK)
@@ -79,6 +86,9 @@ class UserEndpoint(BaseViewSet):
return super().partial_update(request, *args, **kwargs)
@invalidate_cache(path="/api/users/me/")
@invalidate_cache(
path="/api/users/me/workspaces/", multiple=True, user=False
)
def deactivate(self, request):
# Check all workspace user is active
user = self.get_object()
@@ -288,6 +298,8 @@ class AccountEndpoint(BaseAPIView):
class ProfileEndpoint(BaseAPIView):
@method_decorator(cache_control(private=True, max_age=12))
@method_decorator(vary_on_cookie)
def get(self, request):
profile = Profile.objects.get(user=request.user)
serializer = ProfileSerializer(profile)

View File

@@ -44,6 +44,10 @@ from plane.db.models import (
WorkspaceTheme,
)
from plane.utils.cache import cache_response, invalidate_cache
from django.utils.decorators import method_decorator
from django.views.decorators.cache import cache_control
from django.views.decorators.vary import vary_on_cookie
from plane.utils.constants import RESTRICTED_WORKSPACE_SLUGS
class WorkSpaceViewSet(BaseViewSet):
@@ -118,7 +122,7 @@ class WorkSpaceViewSet(BaseViewSet):
status=status.HTTP_400_BAD_REQUEST,
)
if serializer.is_valid():
if serializer.is_valid(raise_exception=True):
serializer.save(owner=request.user)
# Create Workspace member
_ = WorkspaceMember.objects.create(
@@ -171,6 +175,8 @@ class UserWorkSpacesEndpoint(BaseAPIView):
]
@cache_response(60 * 60 * 2)
@method_decorator(cache_control(private=True, max_age=12))
@method_decorator(vary_on_cookie)
def get(self, request):
fields = [
field
@@ -231,7 +237,10 @@ class WorkSpaceAvailabilityCheckEndpoint(BaseAPIView):
status=status.HTTP_400_BAD_REQUEST,
)
workspace = Workspace.objects.filter(slug=slug).exists()
workspace = (
Workspace.objects.filter(slug=slug).exists()
or slug in RESTRICTED_WORKSPACE_SLUGS
)
return Response({"status": not workspace}, status=status.HTTP_200_OK)

View File

@@ -39,6 +39,16 @@ class OauthAdapter(Adapter):
self.client_secret = client_secret
self.code = code
def authentication_error_code(self):
if self.provider == "google":
return "GOOGLE_OAUTH_PROVIDER_ERROR"
elif self.provider == "github":
return "GITHUB_OAUTH_PROVIDER_ERROR"
elif self.provider == "gitlab":
return "GITLAB_OAUTH_PROVIDER_ERROR"
else:
return "OAUTH_NOT_CONFIGURED"
def get_auth_url(self):
return self.auth_url
@@ -62,7 +72,7 @@ class OauthAdapter(Adapter):
response.raise_for_status()
return response.json()
except requests.RequestException:
code = self._provider_error_code()
code = self.authentication_error_code()
raise AuthenticationException(
error_code=AUTHENTICATION_ERROR_CODES[code],
error_message=str(code),
@@ -77,15 +87,7 @@ class OauthAdapter(Adapter):
response.raise_for_status()
return response.json()
except requests.RequestException:
if self.provider == "google":
code = "GOOGLE_OAUTH_PROVIDER_ERROR"
elif self.provider == "github":
code = "GITHUB_OAUTH_PROVIDER_ERROR"
elif self.provider == "gitlab":
code = "GITLAB_OAUTH_PROVIDER_ERROR"
else:
code = "OAUTH_NOT_CONFIGURED"
code = self.authentication_error_code()
raise AuthenticationException(
error_code=AUTHENTICATION_ERROR_CODES[code],
error_message=str(code),
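
The refactor above lifts the provider-to-error-code branching into a single `authentication_error_code` helper so both request paths share it. The same mapping can be sketched as a dict lookup — the standalone function below mirrors the codes in the diff but is illustrative, not the project's implementation:

```python
PROVIDER_ERROR_CODES = {
    "google": "GOOGLE_OAUTH_PROVIDER_ERROR",
    "github": "GITHUB_OAUTH_PROVIDER_ERROR",
    "gitlab": "GITLAB_OAUTH_PROVIDER_ERROR",
}

def authentication_error_code(provider):
    # Fall back to the generic code for unknown providers
    return PROVIDER_ERROR_CODES.get(provider, "OAUTH_NOT_CONFIGURED")

print(authentication_error_code("github"))  # GITHUB_OAUTH_PROVIDER_ERROR
print(authentication_error_code("custom"))  # OAUTH_NOT_CONFIGURED
```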

View File

@@ -3,7 +3,7 @@ from plane.db.models import Profile, Workspace, WorkspaceMemberInvite
def get_redirection_path(user):
# Handle redirections
profile = Profile.objects.get(user=user)
profile, _ = Profile.objects.get_or_create(user=user)
# Redirect to onboarding if the user is not onboarded yet
if not profile.is_onboarded:

View File

@@ -120,7 +120,7 @@ class MagicSignInEndpoint(View):
callback=post_user_auth_workflow,
)
user = provider.authenticate()
profile = Profile.objects.get(user=user)
profile, _ = Profile.objects.get_or_create(user=user)
# Log the user in and record their device info
user_login(request=request, user=user, is_app=True)
if user.is_password_autoset and profile.is_onboarded:

View File

@@ -1391,6 +1391,7 @@ def create_issue_relation_activity(
workspace_id=workspace_id,
comment=f"added {requested_data.get('relation_type')} relation",
old_identifier=related_issue,
epoch=epoch,
)
)
issue = Issue.objects.get(pk=issue_id)

View File

@@ -0,0 +1,44 @@
# Python imports
import json
# Third party imports
from celery import shared_task
# Module imports
from plane.db.models import Page, PageVersion
@shared_task
def page_version(
page_id,
existing_instance,
user_id,
):
# Get the page
page = Page.objects.get(id=page_id)
# Get the current instance
current_instance = (
json.loads(existing_instance) if existing_instance is not None else {}
)
# Create a version if description_html is updated
if current_instance.get("description_html") != page.description_html:
# Create a new page version
PageVersion.objects.create(
page_id=page_id,
workspace_id=page.workspace_id,
description_html=page.description_html,
description_binary=page.description_binary,
owned_by_id=user_id,
last_saved_at=page.updated_at,
)
# If page versions are greater than 20 delete the oldest one
if PageVersion.objects.filter(page_id=page_id).count() > 20:
# Delete the old page version
PageVersion.objects.filter(page_id=page_id).order_by(
"last_saved_at"
).first().delete()
return
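
The retention rule in the `page_version` task — create a version only when the content changed, keep at most 20, drop the oldest — can be sketched in plain Python; the in-memory list below stands in for the `PageVersion` table:

```python
def record_version(versions, new_html, limit=20):
    """Append a version only when the content changed, then prune
    entries beyond the limit (index 0 is the oldest)."""
    if not versions or versions[-1] != new_html:
        versions.append(new_html)
    while len(versions) > limit:
        versions.pop(0)  # mirrors deleting the oldest PageVersion
    return versions

history = []
for i in range(25):
    record_version(history, f"<p>rev {i}</p>")
print(len(history))  # 20
print(history[0])    # <p>rev 5</p>
```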

View File

@@ -161,7 +161,7 @@ class Migration(migrations.Migration):
options={
"verbose_name": "Workspace User Property",
"verbose_name_plural": "Workspace User Property",
"db_table": "Workspace_user_properties",
"db_table": "workspace_user_properties",
"ordering": ("-created_at",),
"unique_together": {("workspace", "user")},
},

View File

@@ -0,0 +1,106 @@
# Generated by Django 4.2.11 on 2024-07-10 13:59
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import django.utils.timezone
import uuid
class Migration(migrations.Migration):
dependencies = [
('db', '0069_alter_account_provider_and_more'),
]
operations = [
migrations.AddField(
model_name='apitoken',
name='is_service',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='exporterhistory',
name='filters',
field=models.JSONField(blank=True, null=True),
),
migrations.AddField(
model_name='exporterhistory',
name='name',
field=models.CharField(blank=True, max_length=255, null=True, verbose_name='Exporter Name'),
),
migrations.AddField(
model_name='exporterhistory',
name='type',
field=models.CharField(choices=[('issue_exports', 'Issue Exports'), ('issue_worklogs', 'Issue Worklogs')], default='issue_exports', max_length=50),
),
migrations.AddField(
model_name='project',
name='is_time_tracking_enabled',
field=models.BooleanField(default=False),
),
migrations.AddField(
model_name='project',
name='start_date',
field=models.DateTimeField(blank=True, null=True),
),
migrations.AddField(
model_name='project',
name='target_date',
field=models.DateTimeField(blank=True, null=True),
),
migrations.CreateModel(
name='PageVersion',
fields=[
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('last_saved_at', models.DateTimeField(default=django.utils.timezone.now)),
('description_binary', models.BinaryField(null=True)),
('description_html', models.TextField(blank=True, default='<p></p>')),
('description_stripped', models.TextField(blank=True, null=True)),
('description_json', models.JSONField(blank=True, default=dict)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('owned_by', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='page_versions', to=settings.AUTH_USER_MODEL)),
('page', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='page_versions', to='db.page')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
('workspace', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='page_versions', to='db.workspace')),
],
options={
'verbose_name': 'Page Version',
'verbose_name_plural': 'Page Versions',
'db_table': 'page_versions',
'ordering': ('-created_at',),
},
),
migrations.CreateModel(
name='IssueType',
fields=[
('created_at', models.DateTimeField(auto_now_add=True, verbose_name='Created At')),
('updated_at', models.DateTimeField(auto_now=True, verbose_name='Last Modified At')),
('id', models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, primary_key=True, serialize=False, unique=True)),
('name', models.CharField(max_length=255)),
('description', models.TextField(blank=True)),
('logo_props', models.JSONField(default=dict)),
('sort_order', models.FloatField(default=65535)),
('is_default', models.BooleanField(default=True)),
('weight', models.PositiveIntegerField(default=0)),
('created_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_created_by', to=settings.AUTH_USER_MODEL, verbose_name='Created By')),
('project', models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, related_name='project_%(class)s', to='db.project')),
('updated_by', models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='%(class)s_updated_by', to=settings.AUTH_USER_MODEL, verbose_name='Last Modified By')),
('workspace', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='workspace_%(class)s', to='db.workspace')),
],
options={
'verbose_name': 'Issue Type',
'verbose_name_plural': 'Issue Types',
'db_table': 'issue_types',
'ordering': ('sort_order',),
'unique_together': {('project', 'name')},
},
),
migrations.AddField(
model_name='issue',
name='type',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='issue_type', to='db.issuetype'),
),
]

View File

@@ -0,0 +1,44 @@
# Generated by Django 4.2.11 on 2024-07-15 06:07
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("db", "0070_apitoken_is_service_exporterhistory_filters_and_more"),
]
operations = [
migrations.RenameModel(
old_name="IssueProperty",
new_name="IssueUserProperty",
),
migrations.AlterModelOptions(
name="issueuserproperty",
options={
"ordering": ("-created_at",),
"verbose_name": "Issue User Property",
"verbose_name_plural": "Issue User Properties",
},
),
migrations.AlterModelTable(
name="issueuserproperty",
table="issue_user_properties",
),
migrations.AddField(
model_name="issuetype",
name="is_active",
field=models.BooleanField(default=True),
),
migrations.AddField(
model_name="project",
name="is_issue_type_enabled",
field=models.BooleanField(default=False),
),
migrations.AlterField(
model_name="issuetype",
name="is_default",
field=models.BooleanField(default=False),
),
]

View File

@@ -29,7 +29,7 @@ from .issue import (
IssueLabel,
IssueLink,
IssueMention,
IssueProperty,
IssueUserProperty,
IssueReaction,
IssueRelation,
IssueSequence,
@@ -50,7 +50,14 @@ from .notification import (
Notification,
UserNotificationPreference,
)
from .page import Page, PageFavorite, PageLabel, PageLog, ProjectPage
from .page import (
Page,
PageFavorite,
PageLabel,
PageLog,
ProjectPage,
PageVersion,
)
from .project import (
Project,
ProjectBaseModel,
@@ -101,3 +108,5 @@ from .webhook import Webhook, WebhookLog
from .dashboard import Dashboard, DashboardWidget, Widget
from .favorite import UserFavorite
from .issue_type import IssueType

View File

@@ -44,6 +44,7 @@ class APIToken(BaseModel):
null=True,
)
expired_at = models.DateTimeField(blank=True, null=True)
is_service = models.BooleanField(default=False)
class Meta:
verbose_name = "API Token"

View File

@@ -18,6 +18,17 @@ def generate_token():
class ExporterHistory(BaseModel):
name = models.CharField(
max_length=255, verbose_name="Exporter Name", null=True, blank=True
)
type = models.CharField(
max_length=50,
default="issue_exports",
choices=(
("issue_exports", "Issue Exports"),
("issue_worklogs", "Issue Worklogs"),
),
)
workspace = models.ForeignKey(
"db.WorkSpace",
on_delete=models.CASCADE,
@@ -55,6 +66,7 @@ class ExporterHistory(BaseModel):
on_delete=models.CASCADE,
related_name="workspace_exporters",
)
filters = models.JSONField(blank=True, null=True)
class Meta:
verbose_name = "Exporter"

View File

@@ -6,7 +6,7 @@ from django.conf import settings
from django.contrib.postgres.fields import ArrayField
from django.core.exceptions import ValidationError
from django.core.validators import MaxValueValidator, MinValueValidator
from django.db import models
from django.db import models, transaction
from django.db.models.signals import post_save
from django.dispatch import receiver
from django.utils import timezone
@@ -91,6 +91,7 @@ class IssueManager(models.Manager):
| models.Q(issue_inbox__status=2)
| models.Q(issue_inbox__isnull=True)
)
.filter(state__is_triage=False)
.exclude(archived_at__isnull=False)
.exclude(project__archived_at__isnull=False)
.exclude(is_draft=True)
@@ -163,6 +164,13 @@ class Issue(ProjectBaseModel):
is_draft = models.BooleanField(default=False)
external_source = models.CharField(max_length=255, null=True, blank=True)
external_id = models.CharField(max_length=255, blank=True, null=True)
type = models.ForeignKey(
"db.IssueType",
on_delete=models.SET_NULL,
related_name="issue_type",
null=True,
blank=True,
)
objects = models.Manager()
issue_objects = IssueManager()
@@ -174,7 +182,6 @@ class Issue(ProjectBaseModel):
ordering = ("-created_at",)
def save(self, *args, **kwargs):
# This means that the model isn't saved to the database yet
if self.state is None:
try:
from plane.db.models import State
@@ -184,7 +191,6 @@ class Issue(ProjectBaseModel):
project=self.project,
default=True,
).first()
# if there is no default state assign any random state
if default_state is None:
random_state = State.objects.filter(
~models.Q(is_triage=True), project=self.project
@@ -198,7 +204,6 @@ class Issue(ProjectBaseModel):
try:
from plane.db.models import State
# Check if the current issue state group is completed or not
if self.state.group == "completed":
self.completed_at = timezone.now()
else:
@@ -207,30 +212,44 @@ class Issue(ProjectBaseModel):
pass
if self._state.adding:
# Get the maximum display_id value from the database
last_id = IssueSequence.objects.filter(
project=self.project
).aggregate(largest=models.Max("sequence"))["largest"]
# aggregate can return None! Check it first.
# If it isn't none, just use the last ID specified (which should be the greatest) and add one to it
if last_id:
self.sequence_id = last_id + 1
else:
self.sequence_id = 1
with transaction.atomic():
last_sequence = (
IssueSequence.objects.filter(project=self.project)
.select_for_update()
.aggregate(largest=models.Max("sequence"))["largest"]
)
self.sequence_id = last_sequence + 1 if last_sequence else 1
# Strip the html tags using html parser
self.description_stripped = (
None
if (
self.description_html == ""
or self.description_html is None
)
else strip_tags(self.description_html)
)
largest_sort_order = Issue.objects.filter(
project=self.project, state=self.state
).aggregate(largest=models.Max("sort_order"))["largest"]
if largest_sort_order is not None:
self.sort_order = largest_sort_order + 10000
largest_sort_order = Issue.objects.filter(
project=self.project, state=self.state
).aggregate(largest=models.Max("sort_order"))["largest"]
if largest_sort_order is not None:
self.sort_order = largest_sort_order + 10000
super(Issue, self).save(*args, **kwargs)
# Strip the html tags using html parser
self.description_stripped = (
None
if (self.description_html == "" or self.description_html is None)
else strip_tags(self.description_html)
)
super(Issue, self).save(*args, **kwargs)
IssueSequence.objects.create(
issue=self, sequence=self.sequence_id, project=self.project
)
else:
# Strip the html tags using html parser
self.description_stripped = (
None
if (
self.description_html == ""
or self.description_html is None
)
else strip_tags(self.description_html)
)
super(Issue, self).save(*args, **kwargs)
def __str__(self):
"""Return name of the issue"""
@@ -474,7 +493,7 @@ class IssueComment(ProjectBaseModel):
return str(self.issue)
class IssueProperty(ProjectBaseModel):
class IssueUserProperty(ProjectBaseModel):
user = models.ForeignKey(
settings.AUTH_USER_MODEL,
on_delete=models.CASCADE,
@@ -487,9 +506,9 @@ class IssueProperty(ProjectBaseModel):
)
class Meta:
verbose_name = "Issue Property"
verbose_name_plural = "Issue Properties"
db_table = "issue_properties"
verbose_name = "Issue User Property"
verbose_name_plural = "Issue User Properties"
db_table = "issue_user_properties"
ordering = ("-created_at",)
unique_together = ["user", "project"]
@@ -667,14 +686,3 @@ class IssueVote(ProjectBaseModel):
def __str__(self):
return f"{self.issue.name} {self.actor.email}"
# TODO: Find a better method to save the model
@receiver(post_save, sender=Issue)
def create_issue_sequence(sender, instance, created, **kwargs):
if created:
IssueSequence.objects.create(
issue=instance,
sequence=instance.sequence_id,
project=instance.project,
)
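
The switch to `transaction.atomic()` plus `select_for_update()` above exists because "read the max sequence, then write max + 1" is a race when two issues are created concurrently. A pure-Python sketch of the same idea, with a `threading.Lock` standing in for the row lock (the Django code locks `IssueSequence` rows instead):

```python
import threading

sequences = []           # stands in for the IssueSequence table
lock = threading.Lock()  # stands in for select_for_update()

def create_issue():
    # The read-max/write-max+1 pair must be atomic; without the lock,
    # two threads can read the same max and insert duplicate ids.
    with lock:
        last = max(sequences, default=0)
        sequences.append(last + 1)

threads = [threading.Thread(target=create_issue) for _ in range(50)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(len(set(sequences)) == 50)  # True: every issue got a unique id
```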

View File

@@ -0,0 +1,37 @@
# Django imports
from django.db import models
# Module imports
from .workspace import WorkspaceBaseModel
class IssueType(WorkspaceBaseModel):
name = models.CharField(max_length=255)
description = models.TextField(blank=True)
logo_props = models.JSONField(default=dict)
sort_order = models.FloatField(default=65535)
is_default = models.BooleanField(default=False)
weight = models.PositiveIntegerField(default=0)
is_active = models.BooleanField(default=True)
class Meta:
unique_together = ["project", "name"]
verbose_name = "Issue Type"
verbose_name_plural = "Issue Types"
db_table = "issue_types"
ordering = ("sort_order",)
def __str__(self):
return self.name
def save(self, *args, **kwargs):
# If we are adding a new issue type, we need to set the sort order
if self._state.adding:
# Get the largest sort order for the project
largest_sort_order = IssueType.objects.filter(
project=self.project
).aggregate(largest=models.Max("sort_order"))["largest"]
# If there are issue types, set the sort order to the largest + 10000
if largest_sort_order is not None:
self.sort_order = largest_sort_order + 10000
super(IssueType, self).save(*args, **kwargs)

View File

@@ -1,6 +1,7 @@
import uuid
from django.conf import settings
from django.utils import timezone
# Django imports
from django.db import models
@@ -66,6 +67,15 @@ class Page(BaseModel):
"""Return owner email and page name"""
return f"{self.owned_by.email} <{self.name}>"
def save(self, *args, **kwargs):
# Strip the html tags using html parser
self.description_stripped = (
None
if (self.description_html == "" or self.description_html is None)
else strip_tags(self.description_html)
)
super(Page, self).save(*args, **kwargs)
class PageLog(BaseModel):
TYPE_CHOICES = (
@@ -93,7 +103,9 @@ class PageLog(BaseModel):
verbose_name="Transaction Type",
)
workspace = models.ForeignKey(
"db.Workspace", on_delete=models.CASCADE, related_name="workspace_page_log"
"db.Workspace",
on_delete=models.CASCADE,
related_name="workspace_page_log",
)
class Meta:
@@ -247,3 +259,41 @@ class TeamPage(BaseModel):
verbose_name_plural = "Team Pages"
db_table = "team_pages"
ordering = ("-created_at",)
class PageVersion(BaseModel):
workspace = models.ForeignKey(
"db.Workspace",
on_delete=models.CASCADE,
related_name="page_versions",
)
page = models.ForeignKey(
"db.Page",
on_delete=models.CASCADE,
related_name="page_versions",
)
last_saved_at = models.DateTimeField(default=timezone.now)
owned_by = models.ForeignKey(
settings.AUTH_USER_MODEL,
on_delete=models.CASCADE,
related_name="page_versions",
)
description_binary = models.BinaryField(null=True)
description_html = models.TextField(blank=True, default="<p></p>")
description_stripped = models.TextField(blank=True, null=True)
description_json = models.JSONField(default=dict, blank=True)
class Meta:
verbose_name = "Page Version"
verbose_name_plural = "Page Versions"
db_table = "page_versions"
ordering = ("-created_at",)
def save(self, *args, **kwargs):
# Strip the html tags using html parser
self.description_stripped = (
None
if (self.description_html == "" or self.description_html is None)
else strip_tags(self.description_html)
)
super(PageVersion, self).save(*args, **kwargs)
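The `description_stripped` derivation above relies on Django's `strip_tags`. A stdlib approximation of the same save-time behavior, using `html.parser` so it runs standalone (`strip_html` and `_TextExtractor` are illustrative names, not Plane's code):

```python
# Approximate the save() behavior: empty or None HTML yields None,
# otherwise markup is removed and only text content is kept.
from html.parser import HTMLParser

class _TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.parts = []

    def handle_data(self, data):
        self.parts.append(data)

def strip_html(description_html):
    if not description_html:
        return None
    parser = _TextExtractor()
    parser.feed(description_html)
    return "".join(parser.parts)
```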

View File

@@ -94,6 +94,8 @@ class Project(BaseModel):
issue_views_view = models.BooleanField(default=True)
page_view = models.BooleanField(default=True)
inbox_view = models.BooleanField(default=False)
is_time_tracking_enabled = models.BooleanField(default=False)
is_issue_type_enabled = models.BooleanField(default=False)
cover_image = models.URLField(blank=True, null=True, max_length=800)
estimate = models.ForeignKey(
"db.Estimate",
@@ -115,6 +117,9 @@ class Project(BaseModel):
related_name="default_state",
)
archived_at = models.DateTimeField(null=True)
# Project start and target date
start_date = models.DateTimeField(null=True, blank=True)
target_date = models.DateTimeField(null=True, blank=True)
def __str__(self):
"""Return name of the project"""

View File

@@ -6,6 +6,7 @@ from django.db import models
from .base import BaseModel
from .project import ProjectBaseModel
from .workspace import WorkspaceBaseModel
from plane.utils.issue_filters import issue_filters
def get_default_filters():
@@ -116,6 +117,26 @@ class IssueView(WorkspaceBaseModel):
db_table = "issue_views"
ordering = ("-created_at",)
def save(self, *args, **kwargs):
query_params = self.filters
self.query = (
issue_filters(query_params, "POST") if query_params else {}
)
if self._state.adding:
if self.project:
largest_sort_order = IssueView.objects.filter(
project=self.project
).aggregate(largest=models.Max("sort_order"))["largest"]
else:
largest_sort_order = IssueView.objects.filter(
workspace=self.workspace, project__isnull=True
).aggregate(largest=models.Max("sort_order"))["largest"]
if largest_sort_order is not None:
self.sort_order = largest_sort_order + 10000
super(IssueView, self).save(*args, **kwargs)
def __str__(self):
"""Return name of the View"""
return f"{self.name} <{self.project.name}>"

View File

@@ -5,6 +5,7 @@ from django.db import models
# Module imports
from .base import BaseModel
from plane.utils.constants import RESTRICTED_WORKSPACE_SLUGS
ROLE_CHOICES = (
(20, "Owner"),
@@ -112,19 +113,7 @@ def get_issue_props():
def slug_validator(value):
if value in [
"404",
"accounts",
"api",
"create-workspace",
"god-mode",
"installations",
"invitations",
"onboarding",
"profile",
"spaces",
"workspace-invitations",
]:
if value in RESTRICTED_WORKSPACE_SLUGS:
raise ValidationError("Slug is not valid")
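The refactored validator reduces to a membership check against the shared constant. A self-contained sketch, with `RESTRICTED_WORKSPACE_SLUGS` shortened to an excerpt and a stand-in `ValidationError` in place of Django's:

```python
# Reject any workspace slug that collides with a reserved route.
# Excerpt of the full RESTRICTED_WORKSPACE_SLUGS constant.
RESTRICTED_WORKSPACE_SLUGS = ["404", "accounts", "api", "god-mode", "spaces"]

class ValidationError(Exception):
    pass

def slug_validator(value):
    if value in RESTRICTED_WORKSPACE_SLUGS:
        raise ValidationError("Slug is not valid")
```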

View File

@@ -23,6 +23,8 @@ from plane.license.utils.instance_value import (
get_configuration_value,
)
from plane.utils.cache import cache_response, invalidate_cache
from django.utils.decorators import method_decorator
from django.views.decorators.cache import cache_control
class InstanceEndpoint(BaseAPIView):
@@ -36,6 +38,7 @@ class InstanceEndpoint(BaseAPIView):
]
@cache_response(60 * 60 * 2, user=False)
@method_decorator(cache_control(private=True, max_age=12))
def get(self, request):
instance = Instance.objects.first()

View File

@@ -16,6 +16,7 @@ from django.core.management.utils import get_random_secret_key
from sentry_sdk.integrations.celery import CeleryIntegration
from sentry_sdk.integrations.django import DjangoIntegration
from sentry_sdk.integrations.redis import RedisIntegration
from corsheaders.defaults import default_headers
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
@@ -127,6 +128,8 @@ else:
CORS_ALLOW_ALL_ORIGINS = True
secure_origins = False
CORS_ALLOW_HEADERS = [*default_headers, "X-API-Key"]
# Application Settings
WSGI_APPLICATION = "plane.wsgi.application"
ASGI_APPLICATION = "plane.asgi.application"
@@ -291,7 +294,7 @@ if bool(os.environ.get("SENTRY_DSN", False)) and os.environ.get(
send_default_pii=True,
environment=os.environ.get("SENTRY_ENVIRONMENT", "development"),
profiles_sample_rate=float(
os.environ.get("SENTRY_PROFILE_SAMPLE_RATE", 0.5)
os.environ.get("SENTRY_PROFILE_SAMPLE_RATE", 0)
),
)

View File

@@ -1,5 +1,9 @@
from .user import UserLiteSerializer
from .issue import LabelLiteSerializer, StateLiteSerializer
from .issue import (
LabelLiteSerializer,
StateLiteSerializer,
IssuePublicSerializer,
)
from .state import StateSerializer, StateLiteSerializer

View File

@@ -188,11 +188,16 @@ class IssueAttachmentSerializer(BaseSerializer):
class IssueReactionSerializer(BaseSerializer):
actor_detail = UserLiteSerializer(read_only=True, source="actor")
class Meta:
model = IssueReaction
fields = "__all__"
fields = [
"issue",
"reaction",
"workspace",
"project",
"actor",
]
read_only_fields = [
"workspace",
"project",
@@ -454,20 +459,6 @@ class IssueCreateSerializer(BaseSerializer):
return super().update(instance, validated_data)
class IssueReactionSerializer(BaseSerializer):
actor_detail = UserLiteSerializer(read_only=True, source="actor")
class Meta:
model = IssueReaction
fields = "__all__"
read_only_fields = [
"workspace",
"project",
"issue",
"actor",
]
class CommentReactionSerializer(BaseSerializer):
class Meta:
model = CommentReaction
@@ -476,7 +467,6 @@ class CommentReactionSerializer(BaseSerializer):
class IssueVoteSerializer(BaseSerializer):
actor_detail = UserLiteSerializer(read_only=True, source="actor")
class Meta:
model = IssueVote
@@ -486,35 +476,45 @@ class IssueVoteSerializer(BaseSerializer):
"workspace",
"project",
"actor",
"actor_detail",
]
read_only_fields = fields
class IssuePublicSerializer(BaseSerializer):
project_detail = ProjectLiteSerializer(read_only=True, source="project")
state_detail = StateLiteSerializer(read_only=True, source="state")
reactions = IssueReactionSerializer(
read_only=True, many=True, source="issue_reactions"
)
votes = IssueVoteSerializer(read_only=True, many=True)
module_ids = serializers.ListField(
child=serializers.UUIDField(),
required=False,
)
label_ids = serializers.ListField(
child=serializers.UUIDField(),
required=False,
)
assignee_ids = serializers.ListField(
child=serializers.UUIDField(),
required=False,
)
class Meta:
model = Issue
fields = [
"id",
"name",
"description_html",
"sequence_id",
"state",
"state_detail",
"project",
"project_detail",
"workspace",
"priority",
"target_date",
"reactions",
"votes",
"module_ids",
"created_by",
"label_ids",
"assignee_ids",
]
read_only_fields = fields

View File

@@ -5,6 +5,11 @@ from plane.space.views import (
ProjectDeployBoardPublicSettingsEndpoint,
ProjectIssuesPublicEndpoint,
WorkspaceProjectAnchorEndpoint,
ProjectCyclesEndpoint,
ProjectModulesEndpoint,
ProjectStatesEndpoint,
ProjectLabelsEndpoint,
ProjectMembersEndpoint,
)
urlpatterns = [
@@ -23,4 +28,29 @@ urlpatterns = [
WorkspaceProjectAnchorEndpoint.as_view(),
name="project-deploy-board",
),
path(
"anchor/<str:anchor>/cycles/",
ProjectCyclesEndpoint.as_view(),
name="project-cycles",
),
path(
"anchor/<str:anchor>/modules/",
ProjectModulesEndpoint.as_view(),
name="project-modules",
),
path(
"anchor/<str:anchor>/states/",
ProjectStatesEndpoint.as_view(),
name="project-states",
),
path(
"anchor/<str:anchor>/labels/",
ProjectLabelsEndpoint.as_view(),
name="project-labels",
),
path(
"anchor/<str:anchor>/members/",
ProjectMembersEndpoint.as_view(),
name="project-members",
),
]

View File

@@ -0,0 +1,248 @@
# Django imports
from django.contrib.postgres.aggregates import ArrayAgg
from django.contrib.postgres.fields import ArrayField
from django.db.models import Q, UUIDField, Value, F, Case, When, JSONField
from django.db.models.functions import Coalesce, JSONObject
# Module imports
from plane.db.models import (
Cycle,
Issue,
Label,
Module,
Project,
ProjectMember,
State,
WorkspaceMember,
)
def issue_queryset_grouper(queryset, group_by, sub_group_by):
FIELD_MAPPER = {
"label_ids": "labels__id",
"assignee_ids": "assignees__id",
"module_ids": "issue_module__module_id",
}
annotations_map = {
"assignee_ids": ("assignees__id", ~Q(assignees__id__isnull=True)),
"label_ids": ("labels__id", ~Q(labels__id__isnull=True)),
"module_ids": (
"issue_module__module_id",
~Q(issue_module__module_id__isnull=True),
),
}
default_annotations = {
key: Coalesce(
ArrayAgg(
field,
distinct=True,
filter=condition,
),
Value([], output_field=ArrayField(UUIDField())),
)
for key, (field, condition) in annotations_map.items()
if FIELD_MAPPER.get(key) != group_by
and FIELD_MAPPER.get(key) != sub_group_by
}
return queryset.annotate(**default_annotations)
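Which default annotations survive can be sketched in plain Python. This assumes the intended exclusion semantics, i.e. a field is skipped when it is already the `group_by` or `sub_group_by` key so the grouping column is not re-aggregated (`annotation_keys` is a hypothetical helper for illustration):

```python
# Pure-Python sketch of issue_queryset_grouper's annotation selection:
# skip any id-list field that is itself the grouping column.
FIELD_MAPPER = {
    "label_ids": "labels__id",
    "assignee_ids": "assignees__id",
    "module_ids": "issue_module__module_id",
}

def annotation_keys(group_by, sub_group_by):
    return sorted(
        key
        for key, field in FIELD_MAPPER.items()
        if field != group_by and field != sub_group_by
    )
```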
def issue_on_results(issues, group_by, sub_group_by):
FIELD_MAPPER = {
"labels__id": "label_ids",
"assignees__id": "assignee_ids",
"issue_module__module_id": "module_ids",
}
original_list = ["assignee_ids", "label_ids", "module_ids"]
required_fields = [
"id",
"name",
"state_id",
"sort_order",
"estimate_point",
"priority",
"start_date",
"target_date",
"sequence_id",
"project_id",
"parent_id",
"cycle_id",
"created_by",
"state__group",
]
if group_by in FIELD_MAPPER:
original_list.remove(FIELD_MAPPER[group_by])
original_list.append(group_by)
if sub_group_by in FIELD_MAPPER:
original_list.remove(FIELD_MAPPER[sub_group_by])
original_list.append(sub_group_by)
required_fields.extend(original_list)
issues = issues.annotate(
vote_items=ArrayAgg(
Case(
When(
votes__isnull=False,
then=JSONObject(
vote=F("votes__vote"),
actor_details=JSONObject(
id=F("votes__actor__id"),
first_name=F("votes__actor__first_name"),
last_name=F("votes__actor__last_name"),
avatar=F("votes__actor__avatar"),
display_name=F("votes__actor__display_name"),
)
),
),
default=None,
output_field=JSONField(),
),
filter=Case(
When(votes__isnull=False, then=True),
default=False,
output_field=JSONField(),
),
distinct=True,
),
reaction_items=ArrayAgg(
Case(
When(
issue_reactions__isnull=False,
then=JSONObject(
reaction=F("issue_reactions__reaction"),
actor_details=JSONObject(
id=F("issue_reactions__actor__id"),
first_name=F("issue_reactions__actor__first_name"),
last_name=F("issue_reactions__actor__last_name"),
avatar=F("issue_reactions__actor__avatar"),
display_name=F("issue_reactions__actor__display_name"),
),
),
),
default=None,
output_field=JSONField(),
),
filter=Case(
When(issue_reactions__isnull=False, then=True),
default=False,
output_field=JSONField(),
),
distinct=True,
),
).values(*required_fields, "vote_items", "reaction_items")
return issues
def issue_group_values(field, slug, project_id=None, filters=None):
filters = filters or {}
if field == "state_id":
queryset = State.objects.filter(
is_triage=False,
workspace__slug=slug,
).values_list("id", flat=True)
if project_id:
return list(queryset.filter(project_id=project_id))
else:
return list(queryset)
if field == "labels__id":
queryset = Label.objects.filter(workspace__slug=slug).values_list(
"id", flat=True
)
if project_id:
return list(queryset.filter(project_id=project_id)) + ["None"]
else:
return list(queryset) + ["None"]
if field == "assignees__id":
if project_id:
return ProjectMember.objects.filter(
workspace__slug=slug,
project_id=project_id,
is_active=True,
).values_list("member_id", flat=True)
else:
return list(
WorkspaceMember.objects.filter(
workspace__slug=slug, is_active=True
).values_list("member_id", flat=True)
)
if field == "issue_module__module_id":
queryset = Module.objects.filter(
workspace__slug=slug,
).values_list("id", flat=True)
if project_id:
return list(queryset.filter(project_id=project_id)) + ["None"]
else:
return list(queryset) + ["None"]
if field == "cycle_id":
queryset = Cycle.objects.filter(
workspace__slug=slug,
).values_list("id", flat=True)
if project_id:
return list(queryset.filter(project_id=project_id)) + ["None"]
else:
return list(queryset) + ["None"]
if field == "project_id":
queryset = Project.objects.filter(workspace__slug=slug).values_list(
"id", flat=True
)
return list(queryset)
if field == "priority":
return [
"low",
"medium",
"high",
"urgent",
"none",
]
if field == "state__group":
return [
"backlog",
"unstarted",
"started",
"completed",
"cancelled",
]
if field == "target_date":
queryset = (
Issue.issue_objects.filter(workspace__slug=slug)
.filter(**filters)
.values_list("target_date", flat=True)
.distinct()
)
if project_id:
return list(queryset.filter(project_id=project_id))
else:
return list(queryset)
if field == "start_date":
queryset = (
Issue.issue_objects.filter(workspace__slug=slug)
.filter(**filters)
.values_list("start_date", flat=True)
.distinct()
)
if project_id:
return list(queryset.filter(project_id=project_id))
else:
return list(queryset)
if field == "created_by":
queryset = (
Issue.issue_objects.filter(workspace__slug=slug)
.filter(**filters)
.values_list("created_by", flat=True)
.distinct()
)
if project_id:
return list(queryset.filter(project_id=project_id))
else:
return list(queryset)
return []

View File

@@ -2,6 +2,7 @@ from .project import (
ProjectDeployBoardPublicSettingsEndpoint,
WorkspaceProjectDeployBoardEndpoint,
WorkspaceProjectAnchorEndpoint,
ProjectMembersEndpoint,
)
from .issue import (
@@ -14,3 +15,11 @@ from .issue import (
)
from .inbox import InboxIssuePublicViewSet
from .cycle import ProjectCyclesEndpoint
from .module import ProjectModulesEndpoint
from .state import ProjectStatesEndpoint
from .label import ProjectLabelsEndpoint

View File

@@ -0,0 +1,35 @@
# Third Party imports
from rest_framework import status
from rest_framework.permissions import AllowAny
from rest_framework.response import Response
# Module imports
from .base import BaseAPIView
from plane.db.models import (
DeployBoard,
Cycle,
)
class ProjectCyclesEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def get(self, request, anchor):
deploy_board = DeployBoard.objects.filter(anchor=anchor).first()
if not deploy_board:
return Response(
{"error": "Invalid anchor"},
status=status.HTTP_404_NOT_FOUND,
)
cycles = Cycle.objects.filter(
workspace__slug=deploy_board.workspace.slug,
project_id=deploy_board.project_id,
).values("id", "name")
return Response(
cycles,
status=status.HTTP_200_OK,
)

View File

@@ -1,20 +1,19 @@
# Python imports
import json
from django.contrib.postgres.aggregates import ArrayAgg
from django.contrib.postgres.fields import ArrayField
from django.db.models.functions import Coalesce, JSONObject
from django.core.serializers.json import DjangoJSONEncoder
from django.db.models import (
Exists,
F,
Func,
OuterRef,
Q,
Prefetch,
UUIDField,
Case,
When,
CharField,
IntegerField,
JSONField,
Value,
Max,
)
# Django imports
@@ -25,6 +24,22 @@ from rest_framework.permissions import AllowAny, IsAuthenticated
# Third Party imports
from rest_framework.response import Response
# Module imports
from .base import BaseAPIView, BaseViewSet
# fetch the space app grouper function separately
from plane.space.utils.grouper import (
issue_group_values,
issue_on_results,
issue_queryset_grouper,
)
from plane.utils.order_queryset import order_issue_queryset
from plane.utils.paginator import (
GroupedOffsetPaginator,
SubGroupedOffsetPaginator,
)
from plane.app.serializers import (
CommentReactionSerializer,
IssueCommentSerializer,
@@ -36,21 +51,160 @@ from plane.db.models import (
Issue,
IssueComment,
IssueLink,
IssueAttachment,
ProjectMember,
IssueReaction,
ProjectMember,
CommentReaction,
DeployBoard,
IssueVote,
ProjectPublicMember,
State,
Label,
)
from plane.bgtasks.issue_activites_task import issue_activity
from plane.utils.issue_filters import issue_filters
# Module imports
from .base import BaseAPIView, BaseViewSet
class ProjectIssuesPublicEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def get(self, request, anchor):
filters = issue_filters(request.query_params, "GET")
order_by_param = request.GET.get("order_by", "-created_at")
deploy_board = DeployBoard.objects.filter(
anchor=anchor, entity_name="project"
).first()
if not deploy_board:
return Response(
{"error": "Project is not published"},
status=status.HTTP_404_NOT_FOUND,
)
project_id = deploy_board.entity_identifier
slug = deploy_board.workspace.slug
issue_queryset = (
Issue.issue_objects.filter(
workspace__slug=slug, project_id=project_id
)
.select_related("workspace", "project", "state", "parent")
.prefetch_related("assignees", "labels", "issue_module__module")
.prefetch_related(
Prefetch(
"issue_reactions",
queryset=IssueReaction.objects.select_related("actor"),
)
)
.prefetch_related(
Prefetch(
"votes",
queryset=IssueVote.objects.select_related("actor"),
)
)
.annotate(cycle_id=F("issue_cycle__cycle_id"))
).distinct()
issue_queryset = issue_queryset.filter(**filters)
# Issue queryset
issue_queryset, order_by_param = order_issue_queryset(
issue_queryset=issue_queryset,
order_by_param=order_by_param,
)
# Group by
group_by = request.GET.get("group_by", False)
sub_group_by = request.GET.get("sub_group_by", False)
# issue queryset
issue_queryset = issue_queryset_grouper(
queryset=issue_queryset,
group_by=group_by,
sub_group_by=sub_group_by,
)
if group_by:
if sub_group_by:
if group_by == sub_group_by:
return Response(
{
"error": "Group by and sub group by cannot have the same parameters"
},
status=status.HTTP_400_BAD_REQUEST,
)
else:
return self.paginate(
request=request,
order_by=order_by_param,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by,
issues=issues,
sub_group_by=sub_group_by,
),
paginator_cls=SubGroupedOffsetPaginator,
group_by_fields=issue_group_values(
field=group_by,
slug=slug,
project_id=project_id,
filters=filters,
),
sub_group_by_fields=issue_group_values(
field=sub_group_by,
slug=slug,
project_id=project_id,
filters=filters,
),
group_by_field_name=group_by,
sub_group_by_field_name=sub_group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
)
else:
# Group paginate
return self.paginate(
request=request,
order_by=order_by_param,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by,
issues=issues,
sub_group_by=sub_group_by,
),
paginator_cls=GroupedOffsetPaginator,
group_by_fields=issue_group_values(
field=group_by,
slug=slug,
project_id=project_id,
filters=filters,
),
group_by_field_name=group_by,
count_filter=Q(
Q(issue_inbox__status=1)
| Q(issue_inbox__status=-1)
| Q(issue_inbox__status=2)
| Q(issue_inbox__isnull=True),
archived_at__isnull=True,
is_draft=False,
),
)
else:
return self.paginate(
order_by=order_by_param,
request=request,
queryset=issue_queryset,
on_results=lambda issues: issue_on_results(
group_by=group_by,
issues=issues,
sub_group_by=sub_group_by,
),
)
class IssueCommentPublicViewSet(BaseViewSet):
@@ -503,67 +657,50 @@ class IssueRetrievePublicEndpoint(BaseAPIView):
]
def get(self, request, anchor, issue_id):
project_deploy_board = DeployBoard.objects.get(
anchor=anchor, entity_name="project"
)
issue = Issue.objects.get(
workspace_id=project_deploy_board.workspace_id,
project_id=project_deploy_board.project_id,
pk=issue_id,
)
serializer = IssuePublicSerializer(issue)
return Response(serializer.data, status=status.HTTP_200_OK)
class ProjectIssuesPublicEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def get(self, request, anchor):
deploy_board = DeployBoard.objects.filter(
anchor=anchor, entity_name="project"
).first()
if not deploy_board:
return Response(
{"error": "Project is not published"},
status=status.HTTP_404_NOT_FOUND,
)
filters = issue_filters(request.query_params, "GET")
# Custom ordering for priority and state
priority_order = ["urgent", "high", "medium", "low", "none"]
state_order = [
"backlog",
"unstarted",
"started",
"completed",
"cancelled",
]
order_by_param = request.GET.get("order_by", "-created_at")
project_id = deploy_board.entity_identifier
slug = deploy_board.workspace.slug
deploy_board = DeployBoard.objects.get(anchor=anchor)
issue_queryset = (
Issue.issue_objects.annotate(
sub_issues_count=Issue.issue_objects.filter(
parent=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
Issue.issue_objects.filter(
pk=issue_id,
workspace__slug=deploy_board.workspace.slug,
project_id=deploy_board.project_id,
)
.select_related("workspace", "project", "state", "parent")
.prefetch_related("assignees", "labels", "issue_module__module")
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(
label_ids=Coalesce(
ArrayAgg(
"labels__id",
distinct=True,
filter=~Q(labels__id__isnull=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
assignee_ids=Coalesce(
ArrayAgg(
"assignees__id",
distinct=True,
filter=~Q(assignees__id__isnull=True)
& Q(assignees__member_project__is_active=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
module_ids=Coalesce(
ArrayAgg(
"issue_module__module_id",
distinct=True,
filter=~Q(issue_module__module_id__isnull=True),
),
Value([], output_field=ArrayField(UUIDField())),
),
)
.filter(project_id=project_id)
.filter(workspace__slug=slug)
.select_related("project", "workspace", "state", "parent")
.prefetch_related("assignees", "labels")
.prefetch_related(
Prefetch(
"issue_reactions",
queryset=IssueReaction.objects.select_related("actor"),
queryset=IssueReaction.objects.select_related(
"issue", "actor"
),
)
)
.prefetch_related(
@@ -572,124 +709,91 @@ class ProjectIssuesPublicEndpoint(BaseAPIView):
queryset=IssueVote.objects.select_related("actor"),
)
)
.filter(**filters)
.annotate(cycle_id=F("issue_cycle__cycle_id"))
.annotate(module_id=F("issue_module__module_id"))
.annotate(
link_count=IssueLink.objects.filter(issue=OuterRef("id"))
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
.annotate(
attachment_count=IssueAttachment.objects.filter(
issue=OuterRef("id")
)
.order_by()
.annotate(count=Func(F("id"), function="Count"))
.values("count")
)
)
# Priority Ordering
if order_by_param == "priority" or order_by_param == "-priority":
priority_order = (
priority_order
if order_by_param == "priority"
else priority_order[::-1]
)
issue_queryset = issue_queryset.annotate(
priority_order=Case(
*[
When(priority=p, then=Value(i))
for i, p in enumerate(priority_order)
],
output_field=CharField(),
)
).order_by("priority_order")
# State Ordering
elif order_by_param in [
"state__name",
"state__group",
"-state__name",
"-state__group",
]:
state_order = (
state_order
if order_by_param in ["state__name", "state__group"]
else state_order[::-1]
)
issue_queryset = issue_queryset.annotate(
state_order=Case(
*[
When(state__group=state_group, then=Value(i))
for i, state_group in enumerate(state_order)
],
default=Value(len(state_order)),
output_field=CharField(),
)
).order_by("state_order")
# assignee and label ordering
elif order_by_param in [
"labels__name",
"-labels__name",
"assignees__first_name",
"-assignees__first_name",
]:
issue_queryset = issue_queryset.annotate(
max_values=Max(
order_by_param[1::]
if order_by_param.startswith("-")
else order_by_param
)
).order_by(
"-max_values"
if order_by_param.startswith("-")
else "max_values"
)
else:
issue_queryset = issue_queryset.order_by(order_by_param)
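The `Case`/`When` annotations in the ordering block rank each issue by the index of its priority in a fixed list, then sort on that rank. A plain-Python equivalent of the technique (`order_by_priority` is illustrative, not the codebase's function):

```python
# Sketch of the priority ordering built with Case/When: rank by
# position in PRIORITY_ORDER, reversed for descending order.
PRIORITY_ORDER = ["urgent", "high", "medium", "low", "none"]

def order_by_priority(issues, descending=False):
    order = PRIORITY_ORDER[::-1] if descending else PRIORITY_ORDER
    rank = {p: i for i, p in enumerate(order)}
    return sorted(issues, key=lambda issue: rank[issue["priority"]])
```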
issues = IssuePublicSerializer(issue_queryset, many=True).data
state_group_order = [
"backlog",
"unstarted",
"started",
"completed",
"cancelled",
]
states = (
State.objects.filter(
~Q(name="Triage"),
workspace__slug=slug,
project_id=project_id,
)
.annotate(
custom_order=Case(
*[
When(group=value, then=Value(index))
for index, value in enumerate(state_group_order)
],
default=Value(len(state_group_order)),
output_field=IntegerField(),
vote_items=ArrayAgg(
Case(
When(
votes__isnull=False,
then=JSONObject(
vote=F("votes__vote"),
actor_details=JSONObject(
id=F("votes__actor__id"),
first_name=F("votes__actor__first_name"),
last_name=F("votes__actor__last_name"),
avatar=F("votes__actor__avatar"),
display_name=F(
"votes__actor__display_name"
),
),
),
),
default=None,
output_field=JSONField(),
),
filter=Case(
When(votes__isnull=False, then=True),
default=False,
output_field=JSONField(),
),
distinct=True,
),
reaction_items=ArrayAgg(
Case(
When(
issue_reactions__isnull=False,
then=JSONObject(
reaction=F("issue_reactions__reaction"),
actor_details=JSONObject(
id=F("issue_reactions__actor__id"),
first_name=F(
"issue_reactions__actor__first_name"
),
last_name=F(
"issue_reactions__actor__last_name"
),
avatar=F("issue_reactions__actor__avatar"),
display_name=F(
"issue_reactions__actor__display_name"
),
),
),
),
default=None,
output_field=JSONField(),
),
filter=Case(
When(issue_reactions__isnull=False, then=True),
default=False,
output_field=JSONField(),
),
distinct=True,
),
)
.values("name", "group", "color", "id")
.order_by("custom_order", "sequence")
)
.values(
"id",
"name",
"state_id",
"sort_order",
"description",
"description_html",
"description_stripped",
"description_binary",
"module_ids",
"label_ids",
"assignee_ids",
"estimate_point",
"priority",
"start_date",
"target_date",
"sequence_id",
"project_id",
"parent_id",
"cycle_id",
"created_by",
"state__group",
"vote_items",
"reaction_items",
)
).first()
labels = Label.objects.filter(
workspace__slug=slug, project_id=project_id
).values("id", "name", "color", "parent")
return Response(
{
"issues": issues,
"states": states,
"labels": labels,
},
status=status.HTTP_200_OK,
)
return Response(issue_queryset, status=status.HTTP_200_OK)

View File

@@ -0,0 +1,35 @@
# Third Party imports
from rest_framework.response import Response
from rest_framework import status
from rest_framework.permissions import AllowAny
# Module imports
from .base import BaseAPIView
from plane.db.models import (
DeployBoard,
Label,
)
class ProjectLabelsEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def get(self, request, anchor):
deploy_board = DeployBoard.objects.filter(anchor=anchor).first()
if not deploy_board:
return Response(
{"error": "Invalid anchor"},
status=status.HTTP_404_NOT_FOUND,
)
labels = Label.objects.filter(
workspace__slug=deploy_board.workspace.slug,
project_id=deploy_board.project_id,
).values("id", "name", "color", "parent")
return Response(
labels,
status=status.HTTP_200_OK,
)

View File

@@ -0,0 +1,35 @@
# Third Party imports
from rest_framework import status
from rest_framework.permissions import AllowAny
from rest_framework.response import Response
# Module imports
from .base import BaseAPIView
from plane.db.models import (
DeployBoard,
Module,
)
class ProjectModulesEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def get(self, request, anchor):
deploy_board = DeployBoard.objects.filter(anchor=anchor).first()
if not deploy_board:
return Response(
{"error": "Invalid anchor"},
status=status.HTTP_404_NOT_FOUND,
)
modules = Module.objects.filter(
workspace__slug=deploy_board.workspace.slug,
project_id=deploy_board.project_id,
).values("id", "name")
return Response(
modules,
status=status.HTTP_200_OK,
)

View File

@@ -12,10 +12,7 @@ from rest_framework.permissions import AllowAny
# Module imports
from .base import BaseAPIView
from plane.app.serializers import DeployBoardSerializer
from plane.db.models import (
Project,
DeployBoard,
)
from plane.db.models import Project, DeployBoard, ProjectMember
class ProjectDeployBoardPublicSettingsEndpoint(BaseAPIView):
@@ -76,3 +73,27 @@ class WorkspaceProjectAnchorEndpoint(BaseAPIView):
)
serializer = DeployBoardSerializer(project_deploy_board)
return Response(serializer.data, status=status.HTTP_200_OK)
class ProjectMembersEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def get(self, request, anchor):
deploy_board = DeployBoard.objects.filter(anchor=anchor).first()
if not deploy_board:
return Response(
{"error": "Invalid anchor"},
status=status.HTTP_404_NOT_FOUND,
)
members = ProjectMember.objects.filter(
project=deploy_board.project,
workspace=deploy_board.workspace,
is_active=True,
).values(
"id",
"member",
"member__first_name",
"member__last_name",
"member__display_name",
"project",
"workspace",
)
return Response(members, status=status.HTTP_200_OK)

View File

@@ -0,0 +1,42 @@
# Django imports
from django.db.models import Q
# Third Party imports
from rest_framework import status
from rest_framework.permissions import AllowAny
from rest_framework.response import Response
# Module imports
from .base import BaseAPIView
from plane.db.models import (
DeployBoard,
State,
)
class ProjectStatesEndpoint(BaseAPIView):
permission_classes = [
AllowAny,
]
def get(self, request, anchor):
deploy_board = DeployBoard.objects.filter(anchor=anchor).first()
if not deploy_board:
return Response(
{"error": "Invalid anchor"},
status=status.HTTP_404_NOT_FOUND,
)
states = (
State.objects.filter(
~Q(name="Triage"),
workspace__slug=deploy_board.workspace.slug,
project_id=deploy_board.project_id,
)
.values("name", "group", "color", "id")
)
return Response(
states,
status=status.HTTP_200_OK,
)

View File

@@ -12,7 +12,7 @@ from django.db.models import (
Sum,
Value,
When,
IntegerField,
FloatField,
)
from django.db.models.functions import (
Coalesce,
@@ -98,7 +98,7 @@ def build_graph_plot(queryset, x_axis, y_axis, segment=None):
# Estimate
else:
queryset = queryset.annotate(
estimate=Sum(Cast("estimate_point__value", IntegerField()))
estimate=Sum(Cast("estimate_point__value", FloatField()))
).order_by(x_axis)
queryset = (
queryset.annotate(segment=F(segment)) if segment else queryset
@@ -137,7 +137,7 @@ def burndown_plot(
estimate__isnull=False,
estimate__type="points",
).exists()
if estimate_type and plot_type == "points":
if estimate_type and plot_type == "points" and cycle_id:
issue_estimates = Issue.objects.filter(
workspace__slug=slug,
project_id=project_id,
@@ -145,7 +145,18 @@ def burndown_plot(
estimate_point__isnull=False,
).values_list("estimate_point__value", flat=True)
issue_estimates = [int(value) for value in issue_estimates]
issue_estimates = [float(value) for value in issue_estimates]
total_estimate_points = sum(issue_estimates)
if estimate_type and plot_type == "points" and module_id:
issue_estimates = Issue.objects.filter(
workspace__slug=slug,
project_id=project_id,
issue_module__module_id=module_id,
estimate_point__isnull=False,
).values_list("estimate_point__value", flat=True)
issue_estimates = [float(value) for value in issue_estimates]
total_estimate_points = sum(issue_estimates)
if cycle_id:
@@ -227,12 +238,12 @@ def burndown_plot(
.order_by("date")
)
for date in date_range:
if plot_type == "points":
if plot_type == "points":
for date in date_range:
cumulative_pending_issues = total_estimate_points
total_completed = 0
total_completed = sum(
int(item["estimate_point__value"])
float(item["estimate_point__value"])
for item in completed_issues_estimate_point_distribution
if item["date"] is not None and item["date"] <= date
)
@@ -241,7 +252,8 @@ def burndown_plot(
chart_data[str(date)] = None
else:
chart_data[str(date)] = cumulative_pending_issues
else:
else:
for date in date_range:
cumulative_pending_issues = total_issues
total_completed = 0
total_completed = sum(

View File

@@ -0,0 +1,30 @@
RESTRICTED_WORKSPACE_SLUGS = [
"404",
"accounts",
"api",
"create-workspace",
"god-mode",
"installations",
"invitations",
"onboarding",
"profile",
"spaces",
"workspace-invitations",
"password",
"flags",
"monitor",
"monitoring",
"ingest",
"plane-pro",
"plane-ultimate",
"enterprise",
"plane-enterprise",
"disco",
"silo",
"chat",
"calendar",
"drive",
"channels",
"upgrade",
"billing",
]
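A hypothetical helper (not part of the diff) shows how a reserved-slug list like the one above is typically consulted: a workspace slug is rejected when it collides with an application route.

```python
# Excerpt of the list above, trimmed for illustration
RESTRICTED_WORKSPACE_SLUGS = ["404", "accounts", "api", "god-mode", "billing"]

def is_valid_workspace_slug(slug: str) -> bool:
    """Return False when the slug is empty or reserved (hypothetical helper)."""
    normalized = slug.strip().lower()
    return bool(normalized) and normalized not in RESTRICTED_WORKSPACE_SLUGS
```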


@@ -154,6 +154,13 @@ def filter_priority(params, filter, method, prefix=""):
]
if len(priorities) and "" not in priorities:
filter[f"{prefix}priority__in"] = priorities
else:
if (
params.get("priority", None)
and len(params.get("priority"))
and params.get("priority") != "null"
):
filter[f"{prefix}priority__in"] = params.get("priority")
return filter
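The `else` branch added above can be restated in isolation (simplified sketch, not the full `filter_priority` signature): when the priority parameter is not a list, the raw value is passed through as long as it is non-empty and not the literal string `"null"`.

```python
def apply_priority_filter(params, issue_filter, prefix=""):
    """Simplified restatement of the fallback branch in the hunk above."""
    priority = params.get("priority", None)
    if priority and len(priority) and priority != "null":
        issue_filter[f"{prefix}priority__in"] = priority
    return issue_filter
```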


@@ -14,313 +14,43 @@ from rest_framework.response import Response
# Module imports
class Cursor:
# The cursor value
def __init__(self, value, offset=0, is_prev=False, has_results=None):
self.value = value
self.offset = int(offset)
self.is_prev = bool(is_prev)
self.has_results = has_results
class InitialGrouperPaginator(object):
"""This class generates the initial grouping of the queryset"""
# Return the cursor value in string format
def __str__(self):
return f"{self.value}:{self.offset}:{int(self.is_prev)}"
# Compare cursors across all attributes
def __eq__(self, other):
return all(
getattr(self, attr) == getattr(other, attr)
for attr in ("value", "offset", "is_prev", "has_results")
)
# Return the representation of the cursor
def __repr__(self):
return f"{type(self).__name__}: value={self.value}, offset={self.offset}, is_prev={int(self.is_prev)}"
# A cursor is truthy only when it has results
def __bool__(self):
return bool(self.has_results)
@classmethod
def from_string(cls, value):
"""Return the cursor value from string format"""
try:
bits = value.split(":")
if len(bits) != 3:
raise ValueError(
"Cursor must be in the format 'value:offset:is_prev'"
)
value = float(bits[0]) if "." in bits[0] else int(bits[0])
return cls(value, int(bits[1]), bool(int(bits[2])))
except (TypeError, ValueError) as e:
raise ValueError(f"Invalid cursor format: {e}")
class CursorResult(Sequence):
def __init__(self, results, next, prev, hits=None, max_hits=None):
self.results = results
self.next = next
self.prev = prev
self.hits = hits
self.max_hits = max_hits
def __len__(self):
# Return the length of the results
return len(self.results)
def __iter__(self):
# Return the iterator of the results
return iter(self.results)
def __getitem__(self, key):
# Return the results based on the key
return self.results[key]
def __repr__(self):
# Return the representation of the results
return f"<{type(self).__name__}: results={len(self.results)}>"
MAX_LIMIT = 100
class BadPaginationError(Exception):
pass
class OffsetPaginator:
"""
The Offset paginator using the offset and limit
with cursor controls
http://example.com/api/users/?cursor=10:0:0&per_page=10
cursor format: value:offset:is_prev (value is the page limit, offset is the page number)
"""
def __init__(
self,
queryset,
order_by=None,
max_limit=MAX_LIMIT,
max_offset=None,
on_results=None,
):
# Build the key tuple, stripping the leading `-` used for descending order
self.key = (
order_by
if order_by is None or isinstance(order_by, (list, tuple, set))
else (order_by[1::] if order_by.startswith("-") else order_by,)
)
# Set desc to true when `-` exists in the order by
self.desc = True if order_by and order_by.startswith("-") else False
self.queryset = queryset
self.max_limit = max_limit
self.max_offset = max_offset
self.on_results = on_results
def get_result(self, limit=100, cursor=None):
# offset is page #
# value is page limit
if cursor is None:
cursor = Cursor(0, 0, 0)
# Get the min from limit and max limit
limit = min(limit, self.max_limit)
# queryset
queryset = self.queryset
if self.key:
queryset = queryset.order_by(
(
F(*self.key).desc(nulls_last=True)
if self.desc
else F(*self.key).asc(nulls_last=True)
),
"-created_at",
)
# The current page
page = cursor.offset
# The offset
offset = cursor.offset * cursor.value
stop = offset + (cursor.value or limit) + 1
if self.max_offset is not None and offset >= self.max_offset:
raise BadPaginationError("Pagination offset too large")
if offset < 0:
raise BadPaginationError("Pagination offset cannot be negative")
results = queryset[offset:stop]
if cursor.value != limit:
results = results[-(limit + 1) :]
# Adjust cursors based on the results for pagination
next_cursor = Cursor(limit, page + 1, False, results.count() > limit)
# If the page is greater than 0, then set the previous cursor
prev_cursor = Cursor(limit, page - 1, True, page > 0)
# Process the results
results = results[:limit]
# Process the results
if self.on_results:
results = self.on_results(results)
# Count the queryset
count = queryset.count()
# Optionally, calculate the total count and max_hits if needed
max_hits = math.ceil(count / limit)
# Return the cursor results
return CursorResult(
results=results,
next=next_cursor,
prev=prev_cursor,
hits=count,
max_hits=max_hits,
)
def process_results(self, results):
raise NotImplementedError
class GroupedOffsetPaginator(OffsetPaginator):
# Field mappers - list m2m fields here
FIELD_MAPPER = {
"labels__id": "label_ids",
"assignees__id": "assignee_ids",
"issue_module__module_id": "module_ids",
}
max_limit = 50
def __init__(
self,
queryset,
group_by_field_name,
group_by_fields,
count_filter,
*args,
**kwargs,
):
# Initiate the parent class for all the parameters
super().__init__(queryset, *args, **kwargs)
count_queryset,
order_by=None,
) -> None:
# Set the queryset
self.queryset = queryset
# Set the group by field name
self.group_by_field_name = group_by_field_name
# Set the group by fields
self.group_by_fields = group_by_fields
# Set the count filter: these are extra filters passed in to calculate the counts
self.count_filter = count_filter
def get_result(self, limit=50, cursor=None):
# offset is page #
# value is page limit
if cursor is None:
cursor = Cursor(0, 0, 0)
limit = min(limit, self.max_limit)
# Adjust the initial offset and stop based on the cursor and limit
queryset = self.queryset
page = cursor.offset
offset = cursor.offset * cursor.value
stop = offset + (cursor.value or limit) + 1
# Check if the offset is greater than the max offset
if self.max_offset is not None and offset >= self.max_offset:
raise BadPaginationError("Pagination offset too large")
# Check if the offset is less than 0
if offset < 0:
raise BadPaginationError("Pagination offset cannot be negative")
# Compute the results
results = {}
# Create window for all the groups
queryset = queryset.annotate(
row_number=Window(
expression=RowNumber(),
partition_by=[F(self.group_by_field_name)],
order_by=(
(
F(*self.key).desc(
nulls_last=True
) # order by desc if desc is set
if self.desc
else F(*self.key).asc(
nulls_last=True
) # Order by asc if set
),
F("created_at").desc(),
),
)
)
# Filter the results by row number
results = queryset.filter(
row_number__gt=offset, row_number__lt=stop
).order_by(
(
F(*self.key).desc(nulls_last=True)
if self.desc
else F(*self.key).asc(nulls_last=True)
),
F("created_at").desc(),
)
# Adjust cursors based on the grouped results for pagination
next_cursor = Cursor(
limit,
page + 1,
False,
queryset.filter(row_number__gte=stop).exists(),
)
# Add previous cursors
prev_cursor = Cursor(
limit,
page - 1,
True,
page > 0,
)
# Count the queryset
count = queryset.count()
# Optionally, calculate the total count and max_hits if needed
# This might require adjustments based on specific use cases
if results:
max_hits = math.ceil(
queryset.values(self.group_by_field_name)
.annotate(
count=Count(
"id",
filter=self.count_filter,
distinct=True,
)
)
.order_by("-count")[0]["count"]
/ limit
)
else:
max_hits = 0
return CursorResult(
results=results,
next=next_cursor,
prev=prev_cursor,
hits=count,
max_hits=max_hits,
# Set the count queryset
self.count_queryset = count_queryset
# Set the key
self.desc = True if order_by and order_by.startswith("-") else False
# Build the key tuple, stripping the leading `-` used for descending order
self.key = (
order_by
if order_by is None or isinstance(order_by, (list, tuple, set))
else (order_by[1::] if order_by.startswith("-") else order_by,)
)
def __get_total_queryset(self):
# Get total items for each group
return (
self.queryset.values(self.group_by_field_name)
self.count_queryset.values(self.group_by_field_name)
.annotate(
count=Count(
"id",
filter=self.count_filter,
distinct=True,
)
)
@@ -395,6 +125,7 @@ class GroupedOffsetPaginator(OffsetPaginator):
for group_id, issues in grouped_by_field_name.items()
}
# Return the processed results
return processed_results
def __query_grouper(self, results):
@@ -404,105 +135,37 @@ class GroupedOffsetPaginator(OffsetPaginator):
group_value = str(result.get(self.group_by_field_name))
if group_value in processed_results:
processed_results[str(group_value)]["results"].append(result)
# Return the processed results
return processed_results
def process_results(self, results):
# Process results
if results:
if self.group_by_field_name in self.FIELD_MAPPER:
processed_results = self.__query_multi_grouper(results=results)
else:
processed_results = self.__query_grouper(results=results)
else:
processed_results = {}
return processed_results
class SubGroupedOffsetPaginator(OffsetPaginator):
# Field mappers: these are the m2m fields
FIELD_MAPPER = {
"labels__id": "label_ids",
"assignees__id": "assignee_ids",
"issue_module__module_id": "module_ids",
}
def __init__(
self,
queryset,
group_by_field_name,
sub_group_by_field_name,
group_by_fields,
sub_group_by_fields,
count_filter,
*args,
**kwargs,
):
# Initiate the parent class for all the parameters
super().__init__(queryset, *args, **kwargs)
# Set the group by field name
self.group_by_field_name = group_by_field_name
self.group_by_fields = group_by_fields
# Set the sub group by field name
self.sub_group_by_field_name = sub_group_by_field_name
self.sub_group_by_fields = sub_group_by_fields
# Set the count filter: these are extra filters passed in to calculate the counts
self.count_filter = count_filter
def get_result(self, limit=30, cursor=None):
# offset is page #
# value is page limit
if cursor is None:
cursor = Cursor(0, 0, 0)
# get the minimum value
# Get the results
def get_result(self, limit=50, is_multi_grouper=False, on_results=None):
# Get the min from limit and max limit
limit = min(limit, self.max_limit)
# Adjust the initial offset and stop based on the cursor and limit
# Get the queryset
queryset = self.queryset
# the current page
page = cursor.offset
# the offset
offset = cursor.offset * cursor.value
# the stop
stop = offset + (cursor.value or limit) + 1
if self.max_offset is not None and offset >= self.max_offset:
raise BadPaginationError("Pagination offset too large")
if offset < 0:
raise BadPaginationError("Pagination offset cannot be negative")
# Compute the results
results = {}
# Create windows for group and sub group field name
# Create window for all the groups
queryset = queryset.annotate(
row_number=Window(
expression=RowNumber(),
partition_by=[
F(self.group_by_field_name),
F(self.sub_group_by_field_name),
],
partition_by=[F(self.group_by_field_name)],
order_by=(
(
F(*self.key).desc(nulls_last=True)
F(*self.key).desc(
nulls_last=True
) # order by desc if desc is set
if self.desc
else F(*self.key).asc(nulls_last=True)
else F(*self.key).asc(
nulls_last=True
) # Order by asc if set
),
"-created_at",
F("created_at").desc(),
),
)
)
# Filter the results
results = queryset.filter(
row_number__gt=offset, row_number__lt=stop
).order_by(
# Filter the results by row number
results = queryset.filter(row_number__lt=limit + 1).order_by(
(
F(*self.key).desc(nulls_last=True)
if self.desc
@@ -511,318 +174,24 @@ class SubGroupedOffsetPaginator(OffsetPaginator):
F("created_at").desc(),
)
# Adjust cursors based on the grouped results for pagination
next_cursor = Cursor(
limit,
page + 1,
False,
queryset.filter(row_number__gte=stop).exists(),
)
# Process the results
if on_results:
results = on_results(results)
# Add previous cursors
prev_cursor = Cursor(
limit,
page - 1,
True,
page > 0,
)
# Count the queryset
count = queryset.count()
# Optionally, calculate the total count and max_hits if needed
# This might require adjustments based on specific use cases
# Process results
if results:
max_hits = math.ceil(
queryset.values(self.group_by_field_name)
.annotate(
count=Count(
"id",
filter=self.count_filter,
distinct=True,
)
)
.order_by("-count")[0]["count"]
/ limit
)
else:
max_hits = 0
return CursorResult(
results=results,
next=next_cursor,
prev=prev_cursor,
hits=count,
max_hits=max_hits,
)
def __get_group_total_queryset(self):
# Get group totals
return (
self.queryset.order_by(self.group_by_field_name)
.values(self.group_by_field_name)
.annotate(
count=Count(
"id",
filter=self.count_filter,
distinct=True,
)
)
.distinct()
)
def __get_subgroup_total_queryset(self):
# Get subgroup totals
return (
self.queryset.values(
self.group_by_field_name, self.sub_group_by_field_name
)
.annotate(
count=Count("id", filter=self.count_filter, distinct=True)
)
.order_by()
.values(
self.group_by_field_name, self.sub_group_by_field_name, "count"
)
)
def __get_total_dict(self):
# Use the above to convert to dictionary of 2D objects
total_group_dict = {}
total_sub_group_dict = {}
for group in self.__get_group_total_queryset():
total_group_dict[str(group.get(self.group_by_field_name))] = (
total_group_dict.get(
str(group.get(self.group_by_field_name)), 0
)
+ (1 if group.get("count") == 0 else group.get("count"))
)
# Sub group total values
for item in self.__get_subgroup_total_queryset():
group = str(item[self.group_by_field_name])
subgroup = str(item[self.sub_group_by_field_name])
count = item["count"]
# Create a dictionary of group and sub group
if group not in total_sub_group_dict:
total_sub_group_dict[str(group)] = {}
# Create a dictionary of sub group
if subgroup not in total_sub_group_dict[group]:
total_sub_group_dict[str(group)][str(subgroup)] = {}
# Create a nested dictionary of group and sub group
total_sub_group_dict[group][subgroup] = count
return total_group_dict, total_sub_group_dict
def __get_field_dict(self):
# Create a field dictionary
total_group_dict, total_sub_group_dict = self.__get_total_dict()
# Create a dictionary of group and sub group
return {
str(group): {
"results": {
str(sub_group): {
"results": [],
"total_results": total_sub_group_dict.get(
str(group)
).get(str(sub_group), 0),
}
for sub_group in total_sub_group_dict.get(str(group), [])
},
"total_results": total_group_dict.get(str(group), 0),
}
for group in self.group_by_fields
}
def __query_multi_grouper(self, results):
# Multi grouper
processed_results = self.__get_field_dict()
# Preparing a dict to keep track of group IDs associated with each label ID
result_group_mapping = defaultdict(set)
result_sub_group_mapping = defaultdict(set)
# Iterate over results to fill the above dictionaries
if self.group_by_field_name in self.FIELD_MAPPER:
for result in results:
result_id = result["id"]
group_id = result[self.group_by_field_name]
result_group_mapping[str(result_id)].add(str(group_id))
# Use the same calculation for the sub group
if self.sub_group_by_field_name in self.FIELD_MAPPER:
for result in results:
result_id = result["id"]
sub_group_id = result[self.sub_group_by_field_name]
result_sub_group_mapping[str(result_id)].add(str(sub_group_id))
# Iterate over results
for result in results:
# Get the group value
group_value = str(result.get(self.group_by_field_name))
# Get the sub group value
sub_group_value = str(result.get(self.sub_group_by_field_name))
# Check if the group value is in the processed results
result_id = result["id"]
if (
group_value in processed_results
and sub_group_value
in processed_results[str(group_value)]["results"]
):
if self.group_by_field_name in self.FIELD_MAPPER:
# for multi grouper
group_ids = list(result_group_mapping[str(result_id)])
result[self.FIELD_MAPPER.get(self.group_by_field_name)] = (
[] if "None" in group_ids else group_ids
)
if self.sub_group_by_field_name in self.FIELD_MAPPER:
sub_group_ids = list(
result_sub_group_mapping[str(result_id)]
)
# for multi groups
result[
self.FIELD_MAPPER.get(self.sub_group_by_field_name)
] = ([] if "None" in sub_group_ids else sub_group_ids)
# If a result belongs to multiple groups, add it to each group
processed_results[str(group_value)]["results"][
str(sub_group_value)
]["results"].append(result)
return processed_results
def __query_grouper(self, results):
# Single grouper
processed_results = self.__get_field_dict()
for result in results:
group_value = str(result.get(self.group_by_field_name))
sub_group_value = str(result.get(self.sub_group_by_field_name))
processed_results[group_value]["results"][sub_group_value][
"results"
].append(result)
return processed_results
def process_results(self, results):
if results:
if (
self.group_by_field_name in self.FIELD_MAPPER
or self.sub_group_by_field_name in self.FIELD_MAPPER
):
# if the grouping is done through m2m then
# Check if the results are multi grouper
if is_multi_grouper:
processed_results = self.__query_multi_grouper(results=results)
else:
# group it directly
processed_results = self.__query_grouper(results=results)
else:
processed_results = {}
return processed_results
class BasePaginator:
"""BasePaginator class can be inherited by any View to return a paginated view"""
# cursor query parameter name
cursor_name = "cursor"
# get the per page parameter from request
def get_per_page(self, request, default_per_page=100, max_per_page=100):
try:
per_page = int(request.GET.get("per_page", default_per_page))
except ValueError:
raise ParseError(detail="Invalid per_page parameter.")
max_per_page = max(max_per_page, default_per_page)
if per_page > max_per_page:
raise ParseError(
detail=f"Invalid per_page value. Cannot exceed {max_per_page}."
)
return per_page
def paginate(
self,
request,
on_results=None,
paginator=None,
paginator_cls=OffsetPaginator,
default_per_page=100,
max_per_page=100,
cursor_cls=Cursor,
extra_stats=None,
controller=None,
group_by_field_name=None,
group_by_fields=None,
sub_group_by_field_name=None,
sub_group_by_fields=None,
count_filter=None,
**paginator_kwargs,
):
"""Paginate the request"""
per_page = self.get_per_page(request, default_per_page, max_per_page)
# Convert the cursor value to integer and float from string
input_cursor = None
try:
input_cursor = cursor_cls.from_string(
request.GET.get(self.cursor_name, f"{per_page}:0:0"),
)
except ValueError:
raise ParseError(detail="Invalid cursor parameter.")
if not paginator:
if group_by_field_name:
paginator_kwargs["group_by_field_name"] = group_by_field_name
paginator_kwargs["group_by_fields"] = group_by_fields
paginator_kwargs["count_filter"] = count_filter
if sub_group_by_field_name:
paginator_kwargs["sub_group_by_field_name"] = (
sub_group_by_field_name
)
paginator_kwargs["sub_group_by_fields"] = (
sub_group_by_fields
)
paginator = paginator_cls(**paginator_kwargs)
try:
cursor_result = paginator.get_result(
limit=per_page, cursor=input_cursor
)
except BadPaginationError:
raise ParseError(detail="Error in parsing")
if on_results:
results = on_results(cursor_result.results)
else:
results = cursor_result.results
if group_by_field_name:
results = paginator.process_results(results=results)
# Add Manipulation functions to the response
if controller is not None:
results = controller(results)
else:
results = results
# Return the response
response = Response(
{
"grouped_by": group_by_field_name,
"sub_grouped_by": sub_group_by_field_name,
"total_count": (cursor_result.hits),
"next_cursor": str(cursor_result.next),
"prev_cursor": str(cursor_result.prev),
"next_page_results": cursor_result.next.has_results,
"prev_page_results": cursor_result.prev.has_results,
"count": cursor_result.__len__(),
"total_pages": cursor_result.max_hits,
"total_results": cursor_result.hits,
"extra_stats": extra_stats,
"results": results,
"results": processed_results,
"total_count": self.count_queryset.count(),
}
)


@@ -0,0 +1,3 @@
from .base import BasePaginator
from .group_paginator import GroupedOffsetPaginator
from .sub_group_paginator import SubGroupedOffsetPaginator


@@ -0,0 +1,114 @@
# Third party imports
from rest_framework.exceptions import ParseError
from rest_framework.response import Response
# Module imports
from .list_paginator import OffsetPaginator
from .paginator_util import Cursor, BadPaginationError
class BasePaginator:
"""BasePaginator class can be inherited by any View to return a paginated view"""
# cursor query parameter name
cursor_name = "cursor"
# get the per page parameter from request
def get_per_page(self, request, default_per_page=100, max_per_page=100):
try:
per_page = int(request.GET.get("per_page", default_per_page))
except ValueError:
raise ParseError(detail="Invalid per_page parameter.")
max_per_page = max(max_per_page, default_per_page)
if per_page > max_per_page:
raise ParseError(
detail=f"Invalid per_page value. Cannot exceed {max_per_page}."
)
return per_page
def paginate(
self,
request,
on_results=None,
paginator=None,
paginator_cls=OffsetPaginator,
default_per_page=100,
max_per_page=100,
cursor_cls=Cursor,
extra_stats=None,
controller=None,
group_by_field_name=None,
group_by_fields=None,
sub_group_by_field_name=None,
sub_group_by_fields=None,
**paginator_kwargs,
):
"""Paginate the request"""
per_page = self.get_per_page(request, default_per_page, max_per_page)
# Convert the cursor value to integer and float from string
input_cursor = None
try:
input_cursor = cursor_cls.from_string(
request.GET.get(self.cursor_name, f"{per_page}:0:0"),
)
except ValueError:
raise ParseError(detail="Invalid cursor parameter.")
if not paginator:
if group_by_field_name:
paginator_kwargs["group_by_field_name"] = group_by_field_name
paginator_kwargs["group_by_fields"] = group_by_fields
if sub_group_by_field_name:
paginator_kwargs["sub_group_by_field_name"] = (
sub_group_by_field_name
)
paginator_kwargs["sub_group_by_fields"] = (
sub_group_by_fields
)
paginator = paginator_cls(**paginator_kwargs)
try:
cursor_result = paginator.get_result(
limit=per_page, cursor=input_cursor
)
except BadPaginationError:
raise ParseError(detail="Error in parsing")
if on_results:
results = on_results(cursor_result.results)
else:
results = cursor_result.results
if group_by_field_name:
results = paginator.process_results(results=results)
# Add Manipulation functions to the response
if controller is not None:
results = controller(results)
else:
results = results
# Return the response
response = Response(
{
"grouped_by": group_by_field_name,
"sub_grouped_by": sub_group_by_field_name,
"total_count": (cursor_result.hits),
"next_cursor": str(cursor_result.next),
"prev_cursor": str(cursor_result.prev),
"next_page_results": cursor_result.next.has_results,
"prev_page_results": cursor_result.prev.has_results,
"count": cursor_result.__len__(),
"total_pages": cursor_result.max_hits,
"total_results": cursor_result.hits,
"extra_stats": extra_stats,
"results": results,
}
)
return response
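The `per_page` validation in `get_per_page` above can be exercised standalone (restated sketch with `ValueError` in place of DRF's `ParseError`, so it runs without `rest_framework` installed):

```python
def get_per_page(raw, default_per_page=100, max_per_page=100):
    """Parse and clamp a per_page query value, mirroring the method above."""
    try:
        per_page = int(raw) if raw is not None else default_per_page
    except ValueError:
        raise ValueError("Invalid per_page parameter.")
    # The ceiling is never allowed below the default
    max_per_page = max(max_per_page, default_per_page)
    if per_page > max_per_page:
        raise ValueError(f"Invalid per_page value. Cannot exceed {max_per_page}.")
    return per_page
```

Note the `max()` call: passing `max_per_page=50` with the default of 100 still permits up to 100 results per page.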


@@ -0,0 +1,238 @@
# Python imports
import math
from collections import defaultdict
# Django imports
from django.db.models import Count, F, Window
from django.db.models.functions import RowNumber
# Module imports
from .list_paginator import OffsetPaginator
from .paginator_util import Cursor, CursorResult, BadPaginationError
class GroupedOffsetPaginator(OffsetPaginator):
# Field mappers - list m2m fields here
FIELD_MAPPER = {
"labels__id": "label_ids",
"assignees__id": "assignee_ids",
"issue_module__module_id": "module_ids",
}
def __init__(
self,
queryset,
group_by_field_name,
group_by_fields,
*args,
**kwargs,
):
# Initiate the parent class for all the parameters
super().__init__(queryset, *args, **kwargs)
# Set the group by field name
self.group_by_field_name = group_by_field_name
# Set the group by fields
self.group_by_fields = group_by_fields
def get_result(self, limit=50, cursor=None):
# offset is page #
# value is page limit
if cursor is None:
cursor = Cursor(0, 0, 0)
limit = min(limit, self.max_limit)
# Adjust the initial offset and stop based on the cursor and limit
queryset = self.queryset
page = cursor.offset
offset = cursor.offset * cursor.value
stop = offset + (cursor.value or limit) + 1
# Check if the offset is greater than the max offset
if self.max_offset is not None and offset >= self.max_offset:
raise BadPaginationError("Pagination offset too large")
# Check if the offset is less than 0
if offset < 0:
raise BadPaginationError("Pagination offset cannot be negative")
# Compute the results
results = {}
# Create window for all the groups
queryset = queryset.annotate(
row_number=Window(
expression=RowNumber(),
partition_by=[F(self.group_by_field_name)],
order_by=(
(
F(*self.key).desc(
nulls_last=True
) # order by desc if desc is set
if self.desc
else F(*self.key).asc(
nulls_last=True
) # Order by asc if set
),
F("created_at").desc(),
),
)
)
# Filter the results by row number
results = queryset.filter(
row_number__gt=offset, row_number__lt=stop
).order_by(
(
F(*self.key).desc(nulls_last=True)
if self.desc
else F(*self.key).asc(nulls_last=True)
),
F("created_at").desc(),
)
# Adjust cursors based on the grouped results for pagination
next_cursor = Cursor(
limit,
page + 1,
False,
queryset.filter(row_number__gte=stop).exists(),
)
# Add previous cursors
prev_cursor = Cursor(
limit,
page - 1,
True,
page > 0,
)
# Count the queryset
count = self.count_queryset.count()
# Optionally, calculate the total count and max_hits if needed
# This might require adjustments based on specific use cases
if results:
max_hits = math.ceil(
self.count_queryset.values(self.group_by_field_name)
.annotate(
count=Count(
"id",
distinct=True,
)
)
.order_by("-count")[0]["count"]
/ limit
)
else:
max_hits = 0
return CursorResult(
results=results,
next=next_cursor,
prev=prev_cursor,
hits=count,
max_hits=max_hits,
)
def __get_total_queryset(self):
# Get total items for each group
return (
self.count_queryset.values(self.group_by_field_name)
.annotate(
count=Count(
"id",
distinct=True,
)
)
.order_by()
)
def __get_total_dict(self):
# Convert the total into dictionary of keys as group name and value as the total
total_group_dict = {}
for group in self.__get_total_queryset():
total_group_dict[str(group.get(self.group_by_field_name))] = (
total_group_dict.get(
str(group.get(self.group_by_field_name)), 0
)
+ (1 if group.get("count") == 0 else group.get("count"))
)
return total_group_dict
def __get_field_dict(self):
# Create a field dictionary
total_group_dict = self.__get_total_dict()
return {
str(field): {
"results": [],
"total_results": total_group_dict.get(str(field), 0),
}
for field in self.group_by_fields
}
def __result_already_added(self, result, group):
# Check if the result is already added then add it
for existing_issue in group:
if existing_issue["id"] == result["id"]:
return True
return False
def __query_multi_grouper(self, results):
# Grouping for m2m values
total_group_dict = self.__get_total_dict()
# Preparing a dict to keep track of group IDs associated with each entity ID
result_group_mapping = defaultdict(set)
# Preparing a dict to group result by group ID
grouped_by_field_name = defaultdict(list)
# Iterate over results to fill the above dictionaries
for result in results:
result_id = result["id"]
group_id = result[self.group_by_field_name]
result_group_mapping[str(result_id)].add(str(group_id))
# Adding group_ids key to each issue and grouping by group_name
for result in results:
result_id = result["id"]
group_ids = list(result_group_mapping[str(result_id)])
result[self.FIELD_MAPPER.get(self.group_by_field_name)] = (
[] if "None" in group_ids else group_ids
)
# If a result belongs to multiple groups, add it to each group
for group_id in group_ids:
if not self.__result_already_added(
result, grouped_by_field_name[group_id]
):
grouped_by_field_name[group_id].append(result)
# Convert grouped_by_field_name back to a list for each group
processed_results = {
str(group_id): {
"results": issues,
"total_results": total_group_dict.get(str(group_id)),
}
for group_id, issues in grouped_by_field_name.items()
}
return processed_results
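The m2m fan-out performed by `__query_multi_grouper` above can be sketched in plain Python (illustrative only; field names are hypothetical): an issue carrying several label ids is appended once to each label's bucket, with a seen-set guarding against duplicates within a group.

```python
from collections import defaultdict

def fan_out(results, ids_field):
    """Place each result into every group listed in its m2m id field."""
    grouped = defaultdict(list)
    seen = defaultdict(set)  # avoid adding the same issue to a group twice
    for result in results:
        for group_id in result[ids_field]:
            if result["id"] not in seen[group_id]:
                seen[group_id].add(result["id"])
                grouped[group_id].append(result)
    return dict(grouped)
```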
def __query_grouper(self, results):
# Grouping for values that are not m2m
processed_results = self.__get_field_dict()
for result in results:
group_value = str(result.get(self.group_by_field_name))
if group_value in processed_results:
processed_results[str(group_value)]["results"].append(result)
return processed_results
def process_results(self, results):
# Process results
if results:
if self.group_by_field_name in self.FIELD_MAPPER:
processed_results = self.__query_multi_grouper(results=results)
else:
processed_results = self.__query_grouper(results=results)
else:
processed_results = {}
return processed_results
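The core trick in `get_result` above is the `RowNumber()` window partitioned by the group field: row numbering restarts for every group, so a single `offset`/`stop` window returns a page from each group in one query. A pure-Python emulation (illustrative only, not the ORM code):

```python
from collections import defaultdict
from itertools import count

def paginate_groups(rows, group_key, offset, stop):
    """Emulate a RowNumber window partitioned by group_key."""
    counters = defaultdict(count)  # one independent counter per group
    page = defaultdict(list)
    for row in rows:  # rows assumed pre-sorted, like the ORM output
        row_number = next(counters[row[group_key]]) + 1  # 1-based, like SQL
        if offset < row_number < stop:
            page[row[group_key]].append(row)
    return dict(page)
```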


@@ -0,0 +1,114 @@
# Python imports
import math
# Django imports
from django.db.models import F
# Module imports
from .paginator_util import Cursor, CursorResult, BadPaginationError, MAX_LIMIT
class OffsetPaginator:
"""
The Offset paginator using the offset and limit
with cursor controls
http://example.com/api/users/?cursor=10:0:0&per_page=10
cursor format: value:offset:is_prev (value is the page limit, offset is the page number)
"""
def __init__(
self,
queryset,
count_queryset,
order_by=None,
max_limit=MAX_LIMIT,
max_offset=None,
on_results=None,
):
# Build the key tuple, stripping the leading `-` used for descending order
self.key = (
order_by
if order_by is None or isinstance(order_by, (list, tuple, set))
else (order_by[1::] if order_by.startswith("-") else order_by,)
)
# Set desc to true when `-` exists in the order by
self.desc = True if order_by and order_by.startswith("-") else False
# Set the queryset
self.queryset = queryset
# Set the max limit
self.max_limit = max_limit
# Set the max offset
self.max_offset = max_offset
# Set the on results
self.on_results = on_results
# Set the count queryset
self.count_queryset = count_queryset
def get_result(self, limit=100, cursor=None):
# offset is page #
# value is page limit
if cursor is None:
cursor = Cursor(0, 0, 0)
# Get the min from limit and max limit
limit = min(limit, self.max_limit)
# queryset
queryset = self.queryset
# Order the queryset
if self.key:
queryset = queryset.order_by(
(
F(*self.key).desc(nulls_last=True)
if self.desc
else F(*self.key).asc(nulls_last=True)
),
"-created_at",
)
# The current page
page = cursor.offset
# The offset
offset = cursor.offset * cursor.value
# The stop
stop = offset + (cursor.value or limit) + 1
if self.max_offset is not None and offset >= self.max_offset:
raise BadPaginationError("Pagination offset too large")
if offset < 0:
raise BadPaginationError("Pagination offset cannot be negative")
# Compute the results
results = queryset[offset:stop]
if cursor.value != limit:
results = results[-(limit + 1) :]
# Adjust cursors based on the results for pagination
next_cursor = Cursor(limit, page + 1, False, results.count() > limit)
# If the page is greater than 0, then set the previous cursor
prev_cursor = Cursor(limit, page - 1, True, page > 0)
# Process the results
results = results[:limit]
# Process the results
if self.on_results:
results = self.on_results(results)
# Count the queryset
count = self.count_queryset.count()
# Optionally, calculate the total count and max_hits if needed
max_hits = math.ceil(count / limit)
# Return the cursor results
return CursorResult(
results=results,
next=next_cursor,
prev=prev_cursor,
hits=count,
max_hits=max_hits,
)
def process_results(self, results):
raise NotImplementedError


@@ -0,0 +1,77 @@
# Python imports
from collections.abc import Sequence
MAX_LIMIT = 100
class BadPaginationError(Exception):
pass
class Cursor:
# The cursor value
def __init__(self, value, offset=0, is_prev=False, has_results=None):
self.value = value
self.offset = int(offset)
self.is_prev = bool(is_prev)
self.has_results = has_results
# Return the cursor value in string format
def __str__(self):
return f"{self.value}:{self.offset}:{int(self.is_prev)}"
# Compare cursors across all attributes
def __eq__(self, other):
return all(
getattr(self, attr) == getattr(other, attr)
for attr in ("value", "offset", "is_prev", "has_results")
)
# Return the representation of the cursor
def __repr__(self):
return f"{type(self).__name__}: value={self.value}, offset={self.offset}, is_prev={int(self.is_prev)}"
# A cursor is truthy only when it has results
def __bool__(self):
return bool(self.has_results)
@classmethod
def from_string(cls, value):
"""Return the cursor value from string format"""
try:
bits = value.split(":")
if len(bits) != 3:
raise ValueError(
"Cursor must be in the format 'value:offset:is_prev'"
)
value = float(bits[0]) if "." in bits[0] else int(bits[0])
return cls(value, int(bits[1]), bool(int(bits[2])))
except (TypeError, ValueError) as e:
raise ValueError(f"Invalid cursor format: {e}")
class CursorResult(Sequence):
def __init__(self, results, next, prev, hits=None, max_hits=None):
self.results = results
self.next = next
self.prev = prev
self.hits = hits
self.max_hits = max_hits
def __len__(self):
# Return the length of the results
return len(self.results)
def __iter__(self):
# Return the iterator of the results
return iter(self.results)
def __getitem__(self, key):
# Return the results based on the key
return self.results[key]
def __repr__(self):
# Return the representation of the results
return f"<{type(self).__name__}: results={len(self.results)}>"
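The cursor string format `"value:offset:is_prev"` round-trips through `__str__` and `from_string`. A minimal standalone sketch of that encoding (free functions here, mirroring the methods above):

```python
def encode_cursor(value, offset: int, is_prev: bool) -> str:
    # Mirrors Cursor.__str__: "value:offset:is_prev"
    return f"{value}:{offset}:{int(is_prev)}"


def decode_cursor(raw: str) -> tuple:
    # Mirrors Cursor.from_string
    bits = raw.split(":")
    if len(bits) != 3:
        raise ValueError("Cursor must be in the format 'value:offset:is_prev'")
    # Support float cursor values (e.g. timestamps) as well as ints
    value = float(bits[0]) if "." in bits[0] else int(bits[0])
    return value, int(bits[1]), bool(int(bits[2]))


assert decode_cursor(encode_cursor(30, 2, False)) == (30, 2, False)
```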

View File

@@ -0,0 +1,310 @@
# Python imports
import math
from collections import defaultdict
# Django imports
from django.db.models import Count, F, Window
from django.db.models.functions import RowNumber
# Module imports
from .list_paginator import OffsetPaginator
from .paginator_util import Cursor, CursorResult, BadPaginationError
class SubGroupedOffsetPaginator(OffsetPaginator):
# Field mapper: these are the many-to-many (m2m) fields
FIELD_MAPPER = {
"labels__id": "label_ids",
"assignees__id": "assignee_ids",
"issue_module__module_id": "module_ids",
}
def __init__(
self,
queryset,
group_by_field_name,
sub_group_by_field_name,
group_by_fields,
sub_group_by_fields,
count_filter,
*args,
**kwargs,
):
# Initialize the parent class with the remaining parameters
super().__init__(queryset, *args, **kwargs)
# Set the group by field name
self.group_by_field_name = group_by_field_name
self.group_by_fields = group_by_fields
# Set the sub group by field name
self.sub_group_by_field_name = sub_group_by_field_name
self.sub_group_by_fields = sub_group_by_fields
def get_result(self, limit=30, cursor=None):
# cursor.offset is the page number, cursor.value is the page limit
if cursor is None:
cursor = Cursor(0, 0, 0)
# clamp the limit to the configured maximum
limit = min(limit, self.max_limit)
# Adjust the initial offset and stop based on the cursor and limit
queryset = self.queryset
# the current page
page = cursor.offset
# the offset
offset = cursor.offset * cursor.value
# the stop index (fetch one extra row to detect a next page)
stop = offset + (cursor.value or limit) + 1
if self.max_offset is not None and offset >= self.max_offset:
raise BadPaginationError("Pagination offset too large")
if offset < 0:
raise BadPaginationError("Pagination offset cannot be negative")
# Compute the results
# Create windows for group and sub group field name
queryset = queryset.annotate(
row_number=Window(
expression=RowNumber(),
partition_by=[
F(self.group_by_field_name),
F(self.sub_group_by_field_name),
],
order_by=(
(
F(*self.key).desc(nulls_last=True)
if self.desc
else F(*self.key).asc(nulls_last=True)
),
"-created_at",
),
)
)
# Filter the results
results = queryset.filter(
row_number__gt=offset, row_number__lt=stop
).order_by(
(
F(*self.key).desc(nulls_last=True)
if self.desc
else F(*self.key).asc(nulls_last=True)
),
F("created_at").desc(),
)
# Adjust cursors based on the grouped results for pagination
next_cursor = Cursor(
limit,
page + 1,
False,
queryset.filter(row_number__gte=stop).exists(),
)
# Add previous cursors
prev_cursor = Cursor(
limit,
page - 1,
True,
page > 0,
)
# Count the queryset
count = self.count_queryset.count()
# Optionally, calculate the total count and max_hits if needed
# This might require adjustments based on specific use cases
if results:
max_hits = math.ceil(
self.count_queryset.values(self.group_by_field_name)
.annotate(
count=Count(
"id",
filter=self.count_filter,
distinct=True,
)
)
.order_by("-count")[0]["count"]
/ limit
)
else:
max_hits = 0
return CursorResult(
results=results,
next=next_cursor,
prev=prev_cursor,
hits=count,
max_hits=max_hits,
)
def __get_group_total_queryset(self):
# Get group totals
return (
self.count_queryset.order_by(self.group_by_field_name)
.values(self.group_by_field_name)
.annotate(
count=Count(
"id",
filter=self.count_filter,
distinct=True,
)
)
.distinct()
)
def __get_subgroup_total_queryset(self):
# Get subgroup totals
return (
self.count_queryset.values(
self.group_by_field_name, self.sub_group_by_field_name
)
.annotate(
count=Count("id", filter=self.count_filter, distinct=True)
)
.order_by()
.values(
self.group_by_field_name, self.sub_group_by_field_name, "count"
)
)
def __get_total_dict(self):
# Convert the group and sub-group totals into nested dictionaries
total_group_dict = {}
total_sub_group_dict = {}
for group in self.__get_group_total_queryset():
total_group_dict[str(group.get(self.group_by_field_name))] = (
total_group_dict.get(
str(group.get(self.group_by_field_name)), 0
)
+ (1 if group.get("count") == 0 else group.get("count"))
)
# Sub group total values
for item in self.__get_subgroup_total_queryset():
group = str(item[self.group_by_field_name])
subgroup = str(item[self.sub_group_by_field_name])
count = item["count"]
# Create a dictionary of group and sub group
if group not in total_sub_group_dict:
total_sub_group_dict[str(group)] = {}
# Store the count in the nested dictionary of group and sub group
total_sub_group_dict[group][subgroup] = count
return total_group_dict, total_sub_group_dict
def __get_field_dict(self):
# Create a field dictionary
total_group_dict, total_sub_group_dict = self.__get_total_dict()
# Create a dictionary of group and sub group
return {
str(group): {
"results": {
str(sub_group): {
"results": [],
"total_results": total_sub_group_dict.get(
str(group)
).get(str(sub_group), 0),
}
for sub_group in total_sub_group_dict.get(str(group), [])
},
"total_results": total_group_dict.get(str(group), 0),
}
for group in self.group_by_fields
}
def __query_multi_grouper(self, results):
# Multi grouper
processed_results = self.__get_field_dict()
# Prepare dicts to track the group and sub-group IDs associated with each result ID
result_group_mapping = defaultdict(set)
result_sub_group_mapping = defaultdict(set)
# Iterate over results to fill the above dictionaries
if self.group_by_field_name in self.FIELD_MAPPER:
for result in results:
result_id = result["id"]
group_id = result[self.group_by_field_name]
result_group_mapping[str(result_id)].add(str(group_id))
# Use the same calculation for the sub group
if self.sub_group_by_field_name in self.FIELD_MAPPER:
for result in results:
result_id = result["id"]
sub_group_id = result[self.sub_group_by_field_name]
result_sub_group_mapping[str(result_id)].add(str(sub_group_id))
# Iterate over results
for result in results:
# Get the group value
group_value = str(result.get(self.group_by_field_name))
# Get the sub group value
sub_group_value = str(result.get(self.sub_group_by_field_name))
# Check if the group value is in the processed results
result_id = result["id"]
if (
group_value in processed_results
and sub_group_value
in processed_results[str(group_value)]["results"]
):
if self.group_by_field_name in self.FIELD_MAPPER:
# for multi grouper
group_ids = list(result_group_mapping[str(result_id)])
result[self.FIELD_MAPPER.get(self.group_by_field_name)] = (
[] if "None" in group_ids else group_ids
)
if self.sub_group_by_field_name in self.FIELD_MAPPER:
sub_group_ids = list(
result_sub_group_mapping[str(result_id)]
)
# for multi groups
result[
self.FIELD_MAPPER.get(self.sub_group_by_field_name)
] = ([] if "None" in sub_group_ids else sub_group_ids)
# If a result belongs to multiple groups, add it to each group
processed_results[str(group_value)]["results"][
str(sub_group_value)
]["results"].append(result)
return processed_results
def __query_grouper(self, results):
# Single grouper
processed_results = self.__get_field_dict()
for result in results:
group_value = str(result.get(self.group_by_field_name))
sub_group_value = str(result.get(self.sub_group_by_field_name))
processed_results[group_value]["results"][sub_group_value][
"results"
].append(result)
return processed_results
def process_results(self, results):
if results:
if (
self.group_by_field_name in self.FIELD_MAPPER
or self.sub_group_by_field_name in self.FIELD_MAPPER
):
# if either grouping field is many-to-many, use the multi grouper
processed_results = self.__query_multi_grouper(results=results)
else:
# group it directly
processed_results = self.__query_grouper(results=results)
else:
processed_results = {}
return processed_results
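`__query_grouper` above buckets each row into a two-level dict keyed by group and sub-group. A simplified standalone version of that shape (hypothetical field names `state_id` and `priority`, not tied to the actual queryset):

```python
def group_rows(rows, group_field, sub_group_field):
    # Nested shape: {group: {"results": {sub_group: {"results": [...]}}}}
    out = {}
    for row in rows:
        group = str(row.get(group_field))
        sub_group = str(row.get(sub_group_field))
        bucket = out.setdefault(group, {"results": {}})["results"]
        bucket.setdefault(sub_group, {"results": []})["results"].append(row)
    return out


rows = [
    {"id": 1, "state_id": "todo", "priority": "high"},
    {"id": 2, "state_id": "todo", "priority": "low"},
    {"id": 3, "state_id": "done", "priority": "high"},
]
grouped = group_rows(rows, "state_id", "priority")
# grouped["todo"]["results"] has two sub-group buckets: "high" and "low"
```

The real paginator pre-seeds every group/sub-group bucket with totals via `__get_field_dict`, so empty buckets still appear in the response; this sketch only creates buckets that receive rows.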

View File

@@ -1,7 +1,7 @@
# base requirements
# django
Django==4.2.11
Django==4.2.14
# rest framework
djangorestframework==3.15.2
# postgres
@@ -26,7 +26,7 @@ django-filter==24.2
# json model
jsonmodels==2.7.0
# sentry
sentry-sdk==2.0.1
sentry-sdk==2.8.0
# storage
django-storages==1.14.2
# user management

View File

@@ -1,6 +1,6 @@
{
"repository": "https://github.com/makeplane/plane.git",
"version": "0.21.0",
"version": "0.22.0",
"license": "AGPL-3.0",
"private": true,
"workspaces": [

View File

@@ -1 +1,2 @@
export * from "./auth";
export * from "./issue";

View File

@@ -0,0 +1,27 @@
export const ALL_ISSUES = "All Issues";
export enum EIssueGroupByToServerOptions {
"state" = "state_id",
"priority" = "priority",
"labels" = "labels__id",
"state_detail.group" = "state__group",
"assignees" = "assignees__id",
"cycle" = "cycle_id",
"module" = "issue_module__module_id",
"target_date" = "target_date",
"project" = "project_id",
"created_by" = "created_by",
}
export enum EServerGroupByToFilterOptions {
"state_id" = "state",
"priority" = "priority",
"labels__id" = "labels",
"state__group" = "state_group",
"assignees__id" = "assignees",
"cycle_id" = "cycle",
"issue_module__module_id" = "module",
"target_date" = "target_date",
"project_id" = "project",
"created_by" = "created_by",
}

View File

@@ -1,6 +1,6 @@
{
"name": "@plane/constants",
"version": "0.21.0",
"version": "0.22.0",
"private": true,
"main": "./index.ts"
}

View File

@@ -1,6 +1,6 @@
{
"name": "@plane/editor",
"version": "0.21.0",
"version": "0.22.0",
"description": "Core Editor that powers Plane",
"private": true,
"main": "./dist/index.mjs",

Some files were not shown because too many files have changed in this diff.